{
  "version": "2.0",
  "service": "<p>Amazon Web Services HealthOmics is a service that helps users such as bioinformaticians, researchers, and scientists to store, query, analyze, and generate insights from genomics and other biological data. It simplifies and accelerates the process of storing and analyzing genomic information for Amazon Web Services.</p> <p>For an introduction to the service, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/what-is-healthomics.html\">What is Amazon Web Services HealthOmics?</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
  "operations": {
    "AbortMultipartReadSetUpload": "<p>Stops a multipart read set upload into a sequence store and returns a response with no body if the operation is successful. To confirm that a multipart read set upload has been stopped, use the <code>ListMultipartReadSetUploads</code> API operation to view all active multipart read set uploads.</p>",
    "AcceptShare": "<p>Accept a resource share request.</p>",
    "BatchDeleteReadSet": "<p>Deletes one or more read sets. If the operation is successful, it returns a response with no body. If there is an error with deleting one of the read sets, the operation returns an error list. If the operation successfully deletes only a subset of files, it will return an error list for the remaining files that fail to be deleted. There is a limit of 100 read sets that can be deleted in each <code>BatchDeleteReadSet</code> API call.</p>",
    "CancelAnnotationImportJob": "<p>Cancels an annotation import job.</p>",
    "CancelRun": "<p>Cancels a run using its ID and returns a response with no body if the operation is successful. To confirm that the run has been cancelled, use the <code>ListRuns</code> API operation to check that it is no longer listed.</p>",
    "CancelVariantImportJob": "<p>Cancels a variant import job.</p>",
    "CompleteMultipartReadSetUpload": "<p>Completes a multipart read set upload into a sequence store after you have initiated the upload process with <code>CreateMultipartReadSetUpload</code> and uploaded all read set parts using <code>UploadReadSetPart</code>. You must specify the parts you uploaded using the parts parameter. If the operation is successful, it returns the read set ID(s) of the uploaded read set(s).</p> <p>For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/synchronous-uploads.html\">Direct upload to a sequence store</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "CreateAnnotationStore": "<p>Creates an annotation store.</p>",
    "CreateAnnotationStoreVersion": "<p> Creates a new version of an annotation store. </p>",
    "CreateMultipartReadSetUpload": "<p>Initiates a multipart read set upload for uploading partitioned source files into a sequence store. You can directly import source files from an EC2 instance and other local compute, or from an S3 bucket. To separate these source files into parts, use the <code>split</code> operation. Each part cannot be larger than 100 MB. If the operation is successful, it provides an <code>uploadId</code> which is required by the <code>UploadReadSetPart</code> API operation to upload parts into a sequence store.</p> <p>To continue uploading a multipart read set into your sequence store, you must use the <code>UploadReadSetPart</code> API operation to upload each part individually following the steps below:</p> <ul> <li> <p>Specify the <code>uploadId</code> obtained from the previous call to <code>CreateMultipartReadSetUpload</code>.</p> </li> <li> <p>Upload parts for that <code>uploadId</code>.</p> </li> </ul> <p>When you have finished uploading parts, use the <code>CompleteMultipartReadSetUpload</code> API to complete the multipart read set upload and to retrieve the final read set IDs in the response.</p> <p>To learn more about creating parts and the <code>split</code> operation, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/synchronous-uploads.html\">Direct upload to a sequence store</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "CreateReferenceStore": "<p>Creates a reference store and returns metadata in JSON format. Reference stores are used to store reference genomes in FASTA format. A reference store is created when the first reference genome is imported. To import additional reference genomes from an Amazon S3 bucket, use the <code>StartReferenceImportJob</code> API operation. </p> <p>For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/create-reference-store.html\">Creating a HealthOmics reference store</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "CreateRunCache": "<p>Creates a run cache to store and reference task outputs from completed private runs. Specify an Amazon S3 location where Amazon Web Services HealthOmics saves the cached data. This data must be immediately accessible and not in an archived state. You can save intermediate task files to a run cache if they are declared as task outputs in the workflow definition file.</p> <p>For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/workflows-call-caching.html\">Call caching</a> and <a href=\"https://docs.aws.amazon.com/omics/latest/dev/workflow-cache-create.html\">Creating a run cache</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "CreateRunGroup": "<p>Creates a run group to limit the compute resources for the runs that are added to the group. Returns an ARN, ID, and tags for the run group.</p>",
    "CreateSequenceStore": "<p>Creates a sequence store and returns its metadata. Sequence stores are used to store sequence data files called read sets that are saved in FASTQ, BAM, uBAM, or CRAM formats. For aligned formats (BAM and CRAM), a sequence store can only use one reference genome. For unaligned formats (FASTQ and uBAM), a reference genome is not required. You can create multiple sequence stores per region per account. </p> <p>The following are optional parameters you can specify for your sequence store:</p> <ul> <li> <p>Use <code>s3AccessConfig</code> to configure your sequence store with S3 access logs (recommended).</p> </li> <li> <p>Use <code>sseConfig</code> to define your own KMS key for encryption.</p> </li> <li> <p>Use <code>eTagAlgorithmFamily</code> to define which algorithm to use for the HealthOmics eTag on objects.</p> </li> <li> <p>Use <code>fallbackLocation</code> to define a backup location for storing files that have failed a direct upload.</p> </li> <li> <p>Use <code>propagatedSetLevelTags</code> to configure tags that propagate to all objects in your store.</p> </li> </ul> <p>For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/create-sequence-store.html\">Creating a HealthOmics sequence store</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "CreateShare": "<p>Creates a cross-account shared resource. The resource owner makes an offer to share the resource with the principal subscriber (an AWS user with a different account than the resource owner).</p> <p>The following resources support cross-account sharing:</p> <ul> <li> <p>HealthOmics variant stores</p> </li> <li> <p>HealthOmics annotation stores</p> </li> <li> <p>Private workflows</p> </li> </ul>",
    "CreateVariantStore": "<p>Creates a variant store.</p>",
    "CreateWorkflow": "<p>Creates a private workflow. Before you create a private workflow, you must create and configure these required resources:</p> <ul> <li> <p> <i>Workflow definition files</i>: Define your workflow in one or more workflow definition files, written in WDL, Nextflow, or CWL. The workflow definition specifies the inputs and outputs for runs that use the workflow. It also includes specifications for the runs and run tasks for your workflow, including compute and memory requirements. The workflow definition file must be in .zip format.</p> </li> <li> <p>(Optional) <i>Parameter template</i>: You can create a parameter template file that defines the run parameters, or Amazon Web Services HealthOmics can generate the parameter template for you.</p> </li> <li> <p> <i>ECR container images</i>: Create container images for the workflow in a private ECR repository, or synchronize images from a supported upstream registry with your Amazon ECR private repository.</p> </li> <li> <p>(Optional) <i>Sentieon licenses</i>: Request a Sentieon license if using the Sentieon software in a private workflow.</p> </li> </ul> <p>For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/creating-private-workflows.html\">Creating or updating a private workflow in Amazon Web Services HealthOmics</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "CreateWorkflowVersion": "<p>Creates a new workflow version for the workflow that you specify with the <code>workflowId</code> parameter.</p> <p>When you create a new version of a workflow, you need to specify the configuration for the new version. It doesn't inherit any configuration values from the workflow.</p> <p>Provide a version name that is unique for this workflow. You cannot change the name after HealthOmics creates the version.</p> <note> <p>Don't include any personally identifiable information (PII) in the version name. Version names appear in the workflow version ARN.</p> </note> <p>For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/workflow-versions.html\">Workflow versioning in Amazon Web Services HealthOmics</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "DeleteAnnotationStore": "<p>Deletes an annotation store.</p>",
    "DeleteAnnotationStoreVersions": "<p> Deletes one or multiple versions of an annotation store. </p>",
    "DeleteReference": "<p>Deletes a reference genome and returns a response with no body if the operation is successful. The read set associated with the reference genome must first be deleted before deleting the reference genome. After the reference genome is deleted, you can delete the reference store using the <code>DeleteReferenceStore</code> API operation.</p> <p>For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/deleting-reference-and-sequence-stores.html\">Deleting HealthOmics reference and sequence stores</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "DeleteReferenceStore": "<p>Deletes a reference store and returns a response with no body if the operation is successful. You can only delete a reference store when it does not contain any reference genomes. To empty a reference store, use <code>DeleteReference</code>.</p> <p>For more information about your workflow status, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/deleting-reference-and-sequence-stores.html\">Deleting HealthOmics reference and sequence stores</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "DeleteRun": "<p>Deletes a run and returns a response with no body if the operation is successful. You can only delete a run that has reached a <code>COMPLETED</code>, <code>FAILED</code>, or <code>CANCELLED</code> stage. A completed run has delivered an output, or was cancelled and resulted in no output. When you delete a run, only the metadata associated with the run is deleted. The run outputs remain in Amazon S3 and logs remain in CloudWatch.</p> <p>To verify that the workflow is deleted:</p> <ul> <li> <p>Use <code>ListRuns</code> to confirm the workflow no longer appears in the list.</p> </li> <li> <p>Use <code>GetRun</code> to verify the workflow cannot be found.</p> </li> </ul>",
    "DeleteRunCache": "<p>Deletes a run cache and returns a response with no body if the operation is successful. This action removes the cache metadata stored in the service account, but does not delete the data in Amazon S3. You can access the cache data in Amazon S3, for inspection or to troubleshoot issues. You can remove old cache data using standard S3 <code>Delete</code> operations. </p> <p>For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/workflow-cache-delete.html\">Deleting a run cache</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "DeleteRunGroup": "<p>Deletes a run group and returns a response with no body if the operation is successful.</p> <p>To verify that the run group is deleted:</p> <ul> <li> <p>Use <code>ListRunGroups</code> to confirm the workflow no longer appears in the list.</p> </li> <li> <p>Use <code>GetRunGroup</code> to verify the workflow cannot be found.</p> </li> </ul>",
    "DeleteS3AccessPolicy": "<p>Deletes an access policy for the specified store.</p>",
    "DeleteSequenceStore": "<p>Deletes a sequence store and returns a response with no body if the operation is successful. You can only delete a sequence store when it does not contain any read sets.</p> <p>Use the <code>BatchDeleteReadSet</code> API operation to ensure that all read sets in the sequence store are deleted. When a sequence store is deleted, all tags associated with the store are also deleted.</p> <p>For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/deleting-reference-and-sequence-stores.html\">Deleting HealthOmics reference and sequence stores</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "DeleteShare": "<p>Deletes a resource share. If you are the resource owner, the subscriber will no longer have access to the shared resource. If you are the subscriber, this operation deletes your access to the share.</p>",
    "DeleteVariantStore": "<p>Deletes a variant store.</p>",
    "DeleteWorkflow": "<p>Deletes a workflow by specifying its ID. This operation returns a response with no body if the deletion is successful.</p> <p>To verify that the workflow is deleted:</p> <ul> <li> <p>Use <code>ListWorkflows</code> to confirm the workflow no longer appears in the list.</p> </li> <li> <p>Use <code>GetWorkflow</code> to verify the workflow cannot be found.</p> </li> </ul>",
    "DeleteWorkflowVersion": "<p>Deletes a workflow version. Deleting a workflow version doesn't affect any ongoing runs that are using the workflow version.</p> <p>For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/workflow-versions.html\">Workflow versioning in Amazon Web Services HealthOmics</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "GetAnnotationImportJob": "<p>Gets information about an annotation import job.</p>",
    "GetAnnotationStore": "<p>Gets information about an annotation store.</p>",
    "GetAnnotationStoreVersion": "<p> Retrieves the metadata for an annotation store version. </p>",
    "GetReadSet": "<p>Retrieves detailed information from parts of a read set and returns the read set in the same format that it was uploaded. You must have read sets uploaded to your sequence store in order to run this operation.</p>",
    "GetReadSetActivationJob": "<p>Returns detailed information about the status of a read set activation job in JSON format.</p>",
    "GetReadSetExportJob": "<p>Retrieves status information about a read set export job and returns the data in JSON format. Use this operation to actively monitor the progress of an export job.</p>",
    "GetReadSetImportJob": "<p>Gets detailed and status information about a read set import job and returns the data in JSON format.</p>",
    "GetReadSetMetadata": "<p>Retrieves the metadata for a read set from a sequence store in JSON format. This operation does not return tags. To retrieve the list of tags for a read set, use the <code>ListTagsForResource</code> API operation.</p>",
    "GetReference": "<p>Downloads parts of data from a reference genome and returns the reference file in the same format that it was uploaded.</p> <p>For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/create-reference-store.html\">Creating a HealthOmics reference store</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "GetReferenceImportJob": "<p>Monitors the status of a reference import job. This operation can be called after calling the <code>StartReferenceImportJob</code> operation.</p>",
    "GetReferenceMetadata": "<p>Retrieves metadata for a reference genome. This operation returns the number of parts, part size, and MD5 of an entire file. This operation does not return tags. To retrieve the list of tags for a read set, use the <code>ListTagsForResource</code> API operation.</p>",
    "GetReferenceStore": "<p>Gets information about a reference store.</p>",
    "GetRun": "<p>Gets detailed information about a specific run using its ID.</p> <p>Amazon Web Services HealthOmics stores a configurable number of runs, as determined by service limits, that are available to the console and API. If <code>GetRun</code> does not return the requested run, you can find all run logs in the CloudWatch logs. For more information about viewing the run logs, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/monitoring-cloudwatch-logs.html\">CloudWatch logs</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "GetRunCache": "<p>Retrieves detailed information about the specified run cache using its ID.</p> <p>For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/workflows-call-caching.html\">Call caching for Amazon Web Services HealthOmics runs</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "GetRunGroup": "<p>Gets information about a run group and returns its metadata.</p>",
    "GetRunTask": "<p>Gets detailed information about a run task using its ID.</p>",
    "GetS3AccessPolicy": "<p>Retrieves details about an access policy on a given store.</p>",
    "GetSequenceStore": "<p>Retrieves metadata for a sequence store using its ID and returns it in JSON format.</p>",
    "GetShare": "<p>Retrieves the metadata for the specified resource share.</p>",
    "GetVariantImportJob": "<p>Gets information about a variant import job.</p>",
    "GetVariantStore": "<p>Gets information about a variant store.</p>",
    "GetWorkflow": "<p>Gets all information about a workflow using its ID.</p> <p>If a workflow is shared with you, you cannot export the workflow.</p> <p>For more information about your workflow status, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/using-get-workflow.html\">Verify the workflow status</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "GetWorkflowVersion": "<p>Gets information about a workflow version. For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/workflow-versions.html\">Workflow versioning in Amazon Web Services HealthOmics</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "ListAnnotationImportJobs": "<p>Retrieves a list of annotation import jobs.</p>",
    "ListAnnotationStoreVersions": "<p> Lists the versions of an annotation store. </p>",
    "ListAnnotationStores": "<p>Retrieves a list of annotation stores.</p>",
    "ListMultipartReadSetUploads": "<p>Lists in-progress multipart read set uploads for a sequence store and returns it in a JSON formatted output. Multipart read set uploads are initiated by the <code>CreateMultipartReadSetUploads</code> API operation. This operation returns a response with no body when the upload is complete. </p>",
    "ListReadSetActivationJobs": "<p>Retrieves a list of read set activation jobs and returns the metadata in a JSON formatted output. To extract metadata from a read set activation job, use the <code>GetReadSetActivationJob</code> API operation.</p>",
    "ListReadSetExportJobs": "<p>Retrieves a list of read set export jobs in a JSON formatted response. This API operation is used to check the status of a read set export job initiated by the <code>StartReadSetExportJob</code> API operation.</p>",
    "ListReadSetImportJobs": "<p>Retrieves a list of read set import jobs and returns the data in JSON format.</p>",
    "ListReadSetUploadParts": "<p>Lists all parts in a multipart read set upload for a sequence store and returns the metadata in a JSON formatted output.</p>",
    "ListReadSets": "<p>Retrieves a list of read sets from a sequence store ID and returns the metadata in JSON format.</p>",
    "ListReferenceImportJobs": "<p>Retrieves the metadata of one or more reference import jobs for a reference store.</p>",
    "ListReferenceStores": "<p>Retrieves a list of reference stores linked to your account and returns their metadata in JSON format.</p> <p>For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/create-reference-store.html\">Creating a reference store</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "ListReferences": "<p>Retrieves the metadata of one or more reference genomes in a reference store.</p> <p>For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/create-reference-store.html\">Creating a reference store</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "ListRunCaches": "<p>Retrieves a list of your run caches and the metadata for each cache.</p>",
    "ListRunGroups": "<p>Retrieves a list of all run groups and returns the metadata for each run group.</p>",
    "ListRunTasks": "<p>Returns a list of tasks and status information within their specified run. Use this operation to monitor runs and to identify which specific tasks have failed.</p>",
    "ListRuns": "<p>Retrieves a list of runs and returns each run's metadata and status.</p> <p>Amazon Web Services HealthOmics stores a configurable number of runs, as determined by service limits, that are available to the console and API. If the <code>ListRuns</code> response doesn't include specific runs that you expected, you can find all run logs in the CloudWatch logs. For more information about viewing the run logs, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/monitoring-cloudwatch-logs.html\">CloudWatch logs</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "ListSequenceStores": "<p>Retrieves a list of sequence stores and returns each sequence store's metadata.</p> <p>For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/create-sequence-store.html\">Creating a HealthOmics sequence store</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "ListShares": "<p>Retrieves the resource shares associated with an account. Use the filter parameter to retrieve a specific subset of the shares.</p>",
    "ListTagsForResource": "<p>Retrieves a list of tags for a resource.</p>",
    "ListVariantImportJobs": "<p>Retrieves a list of variant import jobs.</p>",
    "ListVariantStores": "<p>Retrieves a list of variant stores.</p>",
    "ListWorkflowVersions": "<p>Lists the workflow versions for the specified workflow. For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/workflow-versions.html\">Workflow versioning in Amazon Web Services HealthOmics</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "ListWorkflows": "<p>Retrieves a list of existing workflows. You can filter for specific workflows by their name and type. Using the type parameter, specify <code>PRIVATE</code> to retrieve a list of private workflows or specify <code>READY2RUN</code> for a list of all Ready2Run workflows. If you do not specify the type of workflow, this operation returns a list of existing workflows.</p>",
    "PutS3AccessPolicy": "<p>Adds an access policy to the specified store.</p>",
    "StartAnnotationImportJob": "<p>Starts an annotation import job.</p>",
    "StartReadSetActivationJob": "<p>Activates an archived read set and returns its metadata in a JSON formatted output. AWS HealthOmics automatically archives unused read sets after 30 days. To monitor the status of your read set activation job, use the <code>GetReadSetActivationJob</code> operation.</p> <p>To learn more, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/activating-read-sets.html\">Activating read sets</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "StartReadSetExportJob": "<p>Starts a read set export job. When the export job is finished, the read set is exported to an Amazon S3 bucket which can be retrieved using the <code>GetReadSetExportJob</code> API operation.</p> <p>To monitor the status of the export job, use the <code>ListReadSetExportJobs</code> API operation. </p>",
    "StartReadSetImportJob": "<p>Imports a read set from the sequence store. Read set import jobs support a maximum of 100 read sets of different types. Monitor the progress of your read set import job by calling the <code>GetReadSetImportJob</code> API operation.</p>",
    "StartReferenceImportJob": "<p>Imports a reference genome from Amazon S3 into a specified reference store. You can have multiple reference genomes in a reference store. You can only import reference genomes one at a time into each reference store. Monitor the status of your reference import job by using the <code>GetReferenceImportJob</code> API operation.</p>",
    "StartRun": "<p>Starts a new run and returns details about the run, or duplicates an existing run. A run is a single invocation of a workflow. If you provide request IDs, Amazon Web Services HealthOmics identifies duplicate requests and starts the run only once. Monitor the progress of the run by calling the <code>GetRun</code> API operation.</p> <p>To start a new run, the following inputs are required:</p> <ul> <li> <p>A service role ARN (<code>roleArn</code>).</p> </li> <li> <p>The run's workflow ID (<code>workflowId</code>, not the <code>uuid</code> or <code>runId</code>).</p> </li> <li> <p>An Amazon S3 location (<code>outputUri</code>) where the run outputs will be saved.</p> </li> <li> <p>All required workflow parameters (<code>parameter</code>), which can include optional parameters from the parameter template. The run cannot include any parameters that are not defined in the parameter template. To see all possible parameters, use the <code>GetRun</code> API operation. </p> </li> <li> <p>For runs with a <code>STATIC</code> (default) storage type, specify the required storage capacity (in gibibytes). A storage capacity value is not required for runs that use <code>DYNAMIC</code> storage.</p> </li> </ul> <p> <code>StartRun</code> can also duplicate an existing run using the run's default values. You can modify these default values and/or add other optional inputs. 
To duplicate a run, the following inputs are required:</p> <ul> <li> <p>A service role ARN (<code>roleArn</code>).</p> </li> <li> <p>The ID of the run to duplicate (<code>runId</code>).</p> </li> <li> <p>An Amazon S3 location where the run outputs will be saved (<code>outputUri</code>).</p> </li> </ul> <p>To learn more about the optional parameters for <code>StartRun</code>, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/starting-a-run.html\">Starting a run</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p> <p>Use the <code>retentionMode</code> input to control how long the metadata for each run is stored in CloudWatch. There are two retention modes:</p> <ul> <li> <p>Specify <code>REMOVE</code> to automatically remove the oldest runs when you reach the maximum service retention limit for runs. It is recommended that you use the <code>REMOVE</code> mode to initiate major run requests so that your runs do not fail when you reach the limit.</p> </li> <li> <p>The <code>retentionMode</code> is set to the <code>RETAIN</code> mode by default, which allows you to manually remove runs after reaching the maximum service retention limit. Under this setting, you cannot create additional runs until you remove the excess runs.</p> </li> </ul> <p>To learn more about the retention modes, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/run-retention.html\">Run retention mode</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "StartVariantImportJob": "<p>Starts a variant import job.</p>",
    "TagResource": "<p>Tags a resource.</p>",
    "UntagResource": "<p>Removes tags from a resource.</p>",
    "UpdateAnnotationStore": "<p>Updates an annotation store.</p>",
    "UpdateAnnotationStoreVersion": "<p> Updates the description of an annotation store version. </p>",
    "UpdateRunCache": "<p>Updates a run cache using its ID and returns a response with no body if the operation is successful. You can update the run cache description, name, or the default run cache behavior with <code>CACHE_ON_FAILURE</code> or <code>CACHE_ALWAYS</code>. To confirm that your run cache settings have been properly updated, use the <code>GetRunCache</code> API operation.</p> <p>For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/how-run-cache.html\">How call caching works</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "UpdateRunGroup": "<p>Updates the settings of a run group and returns a response with no body if the operation is successful.</p> <p>You can update the following settings with <code>UpdateRunGroup</code>:</p> <ul> <li> <p>Maximum number of CPUs</p> </li> <li> <p>Run time (measured in minutes)</p> </li> <li> <p>Number of GPUs</p> </li> <li> <p>Number of concurrent runs</p> </li> <li> <p>Group name</p> </li> </ul> <p>To confirm that the settings have been successfully updated, use the <code>ListRunGroups</code> or <code>GetRunGroup</code> API operations to verify that the desired changes have been made.</p>",
    "UpdateSequenceStore": "<p>Update one or more parameters for the sequence store.</p>",
    "UpdateVariantStore": "<p>Updates a variant store.</p>",
    "UpdateWorkflow": "<p>Updates information about a workflow.</p> <p>You can update the following workflow information:</p> <ul> <li> <p>Name</p> </li> <li> <p>Description</p> </li> <li> <p>Default storage type</p> </li> <li> <p>Default storage capacity (with workflow ID)</p> </li> </ul> <p>This operation returns a response with no body if the operation is successful. You can check the workflow updates by calling the <code>GetWorkflow</code> API operation.</p> <p>For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/update-private-workflow.html\">Update a private workflow</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "UpdateWorkflowVersion": "<p>Updates information about the workflow version. For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/workflow-versions.html\">Workflow versioning in Amazon Web Services HealthOmics</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
    "UploadReadSetPart": "<p>Uploads a specific part of a read set into a sequence store. When you a upload a read set part with a part number that already exists, the new part replaces the existing one. This operation returns a JSON formatted response containing a string identifier that is used to confirm that parts are being added to the intended upload.</p> <p>For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/synchronous-uploads.html\">Direct upload to a sequence store</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>"
  },
  "shapes": {
    "AbortMultipartReadSetUploadRequest": {
      "base": null,
      "refs": {
      }
    },
    "AbortMultipartReadSetUploadResponse": {
      "base": null,
      "refs": {
      }
    },
    "Accelerators": {
      "base": null,
      "refs": {
        "CreateWorkflowRequest$accelerators": "<p>The computational accelerator specified to run the workflow.</p>",
        "CreateWorkflowVersionRequest$accelerators": "<p>The computational accelerator for this workflow version.</p>",
        "GetRunResponse$accelerators": "<p>The computational accelerator used to run the workflow.</p>",
        "GetWorkflowResponse$accelerators": "<p>The computational accelerator specified to run the workflow. </p>",
        "GetWorkflowVersionResponse$accelerators": "<p>The accelerator for this workflow version.</p>"
      }
    },
    "AcceptShareRequest": {
      "base": null,
      "refs": {
      }
    },
    "AcceptShareResponse": {
      "base": null,
      "refs": {
      }
    },
    "AccessDeniedException": {
      "base": "<p>You do not have sufficient access to perform this action.</p>",
      "refs": {
      }
    },
    "AccessLogLocation": {
      "base": null,
      "refs": {
        "S3AccessConfig$accessLogLocation": "<p>Location of the access logs.</p>",
        "SequenceStoreS3Access$accessLogLocation": "<p>Location of the access logs.</p>"
      }
    },
    "ActivateReadSetFilter": {
      "base": "<p>A read set activation job filter.</p>",
      "refs": {
        "ListReadSetActivationJobsRequest$filter": "<p>A filter to apply to the list.</p>"
      }
    },
    "ActivateReadSetJobItem": {
      "base": "<p>A read set activation job.</p>",
      "refs": {
        "ActivateReadSetJobList$member": null
      }
    },
    "ActivateReadSetJobList": {
      "base": null,
      "refs": {
        "ListReadSetActivationJobsResponse$activationJobs": "<p>A list of jobs.</p>"
      }
    },
    "ActivateReadSetSourceItem": {
      "base": "<p>A source for a read set activation job.</p>",
      "refs": {
        "ActivateReadSetSourceList$member": null
      }
    },
    "ActivateReadSetSourceList": {
      "base": null,
      "refs": {
        "GetReadSetActivationJobResponse$sources": "<p>The job's source files.</p>"
      }
    },
    "ActivationJobId": {
      "base": null,
      "refs": {
        "ActivateReadSetJobItem$id": "<p>The job's ID.</p>",
        "GetReadSetActivationJobRequest$id": "<p>The job's ID.</p>",
        "GetReadSetActivationJobResponse$id": "<p>The job's ID.</p>",
        "StartReadSetActivationJobResponse$id": "<p>The job's ID.</p>"
      }
    },
    "AnnotationFieldMap": {
      "base": null,
      "refs": {
        "AnnotationImportJobItem$annotationFields": "<p> The annotation schema generated by the parsed annotation data. </p>",
        "GetAnnotationImportResponse$annotationFields": "<p>The annotation schema generated by the parsed annotation data.</p>",
        "GetVariantImportResponse$annotationFields": "<p>The annotation schema generated by the parsed annotation data.</p>",
        "StartAnnotationImportRequest$annotationFields": "<p>The annotation schema generated by the parsed annotation data.</p>",
        "StartVariantImportRequest$annotationFields": "<p>The annotation schema generated by the parsed annotation data.</p>",
        "VariantImportJobItem$annotationFields": "<p> The annotation schema generated by the parsed annotation data. </p>"
      }
    },
    "AnnotationFieldMapKeyString": {
      "base": null,
      "refs": {
        "AnnotationFieldMap$key": null
      }
    },
    "AnnotationFieldMapValueString": {
      "base": null,
      "refs": {
        "AnnotationFieldMap$value": null
      }
    },
    "AnnotationImportItemDetail": {
      "base": "<p>Details about an imported annotation item.</p>",
      "refs": {
        "AnnotationImportItemDetails$member": null
      }
    },
    "AnnotationImportItemDetails": {
      "base": null,
      "refs": {
        "GetAnnotationImportResponse$items": "<p>The job's imported items.</p>"
      }
    },
    "AnnotationImportItemSource": {
      "base": "<p>A source for an annotation import job.</p>",
      "refs": {
        "AnnotationImportItemSources$member": null
      }
    },
    "AnnotationImportItemSources": {
      "base": null,
      "refs": {
        "StartAnnotationImportRequest$items": "<p>Items to import.</p>"
      }
    },
    "AnnotationImportJobItem": {
      "base": "<p>An annotation import job.</p>",
      "refs": {
        "AnnotationImportJobItems$member": null
      }
    },
    "AnnotationImportJobItems": {
      "base": null,
      "refs": {
        "ListAnnotationImportJobsResponse$annotationImportJobs": "<p>A list of jobs.</p>"
      }
    },
    "AnnotationStoreItem": {
      "base": "<p>An annotation store.</p>",
      "refs": {
        "AnnotationStoreItems$member": null
      }
    },
    "AnnotationStoreItems": {
      "base": null,
      "refs": {
        "ListAnnotationStoresResponse$annotationStores": "<p>A list of stores.</p>"
      }
    },
    "AnnotationStoreVersionItem": {
      "base": "<p> Annotation store versions. </p>",
      "refs": {
        "AnnotationStoreVersionItems$member": null
      }
    },
    "AnnotationStoreVersionItems": {
      "base": null,
      "refs": {
        "ListAnnotationStoreVersionsResponse$annotationStoreVersions": "<p> Lists all versions of an annotation store. </p>"
      }
    },
    "AnnotationType": {
      "base": null,
      "refs": {
        "TsvStoreOptions$annotationType": "<p>The store's annotation type.</p>",
        "TsvVersionOptions$annotationType": "<p> The store version's annotation type. </p>"
      }
    },
    "Arn": {
      "base": null,
      "refs": {
        "AnnotationImportJobItem$roleArn": "<p>The job's service role ARN.</p>",
        "AnnotationStoreItem$storeArn": "<p>The store's ARN.</p>",
        "AnnotationStoreVersionItem$versionArn": "<p>The ARN for an annotation store version.</p>",
        "GetAnnotationImportResponse$roleArn": "<p>The job's service role ARN.</p>",
        "GetAnnotationStoreResponse$storeArn": "<p>The store's ARN.</p>",
        "GetAnnotationStoreVersionResponse$versionArn": "<p>The ARN for the annotation store version.</p>",
        "GetVariantImportResponse$roleArn": "<p>The job's service role ARN.</p>",
        "GetVariantStoreResponse$storeArn": "<p>The store's ARN.</p>",
        "StartAnnotationImportRequest$roleArn": "<p>A service role for the job.</p>",
        "StartVariantImportRequest$roleArn": "<p>A service role for the job.</p>",
        "VariantImportJobItem$roleArn": "<p>The job's service role ARN.</p>",
        "VariantStoreItem$storeArn": "<p>The store's ARN.</p>"
      }
    },
    "ArnList": {
      "base": null,
      "refs": {
        "Filter$resourceArns": "<p>Filter based on the Amazon Resource Name (ARN) of the resource. You can specify up to 10 values.</p>"
      }
    },
    "AwsAccountId": {
      "base": null,
      "refs": {
        "CreateRunCacheRequest$cacheBucketOwnerId": "<p>The Amazon Web Services account ID of the expected owner of the S3 bucket for the run cache. If not provided, your account ID is set as the owner of the bucket.</p>",
        "GetRunCacheResponse$cacheBucketOwnerId": "<p>The identifier of the bucket owner.</p>",
        "RegistryMapping$ecrAccountId": "<p>Account ID of the account that owns the upstream container image.</p>"
      }
    },
    "BatchDeleteReadSetRequest": {
      "base": null,
      "refs": {
      }
    },
    "BatchDeleteReadSetResponse": {
      "base": null,
      "refs": {
      }
    },
    "Blob": {
      "base": null,
      "refs": {
        "CreateWorkflowRequest$definitionZip": "<p>A ZIP archive containing the main workflow definition file and dependencies that it imports for the workflow. You can use a file with a fileb:// prefix instead of the Base64 string. For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/workflow-defn-requirements.html\">Workflow definition requirements</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
        "CreateWorkflowVersionRequest$definitionZip": "<p>A ZIP archive containing the main workflow definition file and dependencies that it imports for this workflow version. You can use a file with a fileb:// prefix instead of the Base64 string. For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/workflow-defn-requirements.html\">Workflow definition requirements</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>"
      }
    },
    "Boolean": {
      "base": null,
      "refs": {
        "GetRunTaskResponse$cacheHit": "<p>Set to true if Amazon Web Services HealthOmics found a matching entry in the run cache for this task.</p>",
        "TaskListItem$cacheHit": "<p>Set to true if Amazon Web Services HealthOmics found a matching entry in the run cache for this task.</p>",
        "VcfOptions$ignoreQualField": "<p>The file's ignore qual field setting.</p>",
        "VcfOptions$ignoreFilterField": "<p>The file's ignore filter field setting.</p>",
        "WorkflowParameter$optional": "<p>Whether the parameter is optional.</p>"
      }
    },
    "CacheBehavior": {
      "base": null,
      "refs": {
        "CreateRunCacheRequest$cacheBehavior": "<p>Default cache behavior for runs that use this cache. Supported values are:</p> <p> <code>CACHE_ON_FAILURE</code>: Caches task outputs from completed tasks for runs that fail. This setting is useful if you're debugging a workflow that fails after several tasks completed successfully. The subsequent run uses the cache outputs for previously-completed tasks if the task definition, inputs, and container in ECR are identical to the prior run.</p> <p> <code>CACHE_ALWAYS</code>: Caches task outputs from completed tasks for all runs. This setting is useful in development mode, but do not use it in a production setting.</p> <p>If you don't specify a value, the default behavior is <code>CACHE_ON_FAILURE</code>. When you start a run that uses this cache, you can override the default cache behavior.</p> <p>For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/how-run-cache.html#run-cache-behavior\">Run cache behavior</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
        "GetRunCacheResponse$cacheBehavior": "<p>The default cache behavior for runs using this cache.</p>",
        "GetRunResponse$cacheBehavior": "<p>The run cache behavior for the run.</p>",
        "RunCacheListItem$cacheBehavior": "<p>Default cache behavior for the run cache.</p>",
        "StartRunRequest$cacheBehavior": "<p>The cache behavior for the run. Specify this value to override the default cache behavior that you set when you created the cache. For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/how-run-cache.html#run-cache-behavior\">Run cache behavior</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
        "UpdateRunCacheRequest$cacheBehavior": "<p>Update the default run cache behavior.</p>"
      }
    },
    "CancelAnnotationImportRequest": {
      "base": null,
      "refs": {
      }
    },
    "CancelAnnotationImportResponse": {
      "base": null,
      "refs": {
      }
    },
    "CancelRunRequest": {
      "base": null,
      "refs": {
      }
    },
    "CancelVariantImportRequest": {
      "base": null,
      "refs": {
      }
    },
    "CancelVariantImportResponse": {
      "base": null,
      "refs": {
      }
    },
    "ClientToken": {
      "base": null,
      "refs": {
        "CreateMultipartReadSetUploadRequest$clientToken": "<p>An idempotency token that can be used to avoid triggering multiple multipart uploads.</p>",
        "CreateReferenceStoreRequest$clientToken": "<p>To ensure that requests don't run multiple times, specify a unique token for each request.</p>",
        "CreateSequenceStoreRequest$clientToken": "<p>An idempotency token used to dedupe retry requests so that duplicate sequence stores are not created.</p>",
        "StartReadSetActivationJobRequest$clientToken": "<p>To ensure that jobs don't run multiple times, specify a unique token for each job.</p>",
        "StartReadSetExportJobRequest$clientToken": "<p>To ensure that jobs don't run multiple times, specify a unique token for each job.</p>",
        "StartReadSetImportJobRequest$clientToken": "<p>To ensure that jobs don't run multiple times, specify a unique token for each job.</p>",
        "StartReferenceImportJobRequest$clientToken": "<p>To ensure that jobs don't run multiple times, specify a unique token for each job.</p>",
        "UpdateSequenceStoreRequest$clientToken": "<p>To ensure that requests don't run multiple times, specify a unique token for each request.</p>"
      }
    },
    "CommentChar": {
      "base": null,
      "refs": {
        "ReadOptions$comment": "<p>The file's comment character.</p>"
      }
    },
    "CompleteMultipartReadSetUploadRequest": {
      "base": null,
      "refs": {
      }
    },
    "CompleteMultipartReadSetUploadResponse": {
      "base": null,
      "refs": {
      }
    },
    "CompleteReadSetUploadPartList": {
      "base": null,
      "refs": {
        "CompleteMultipartReadSetUploadRequest$parts": "<p>The individual uploads or parts of a multipart upload.</p>"
      }
    },
    "CompleteReadSetUploadPartListItem": {
      "base": "<p> Part of the request to the CompleteMultipartReadSetUpload API, including metadata. </p>",
      "refs": {
        "CompleteReadSetUploadPartList$member": null
      }
    },
    "CompleteReadSetUploadPartListItemChecksumString": {
      "base": null,
      "refs": {
        "CompleteReadSetUploadPartListItem$checksum": "<p> A unique identifier used to confirm that parts are being added to the correct upload. </p>"
      }
    },
    "CompleteReadSetUploadPartListItemPartNumberInteger": {
      "base": null,
      "refs": {
        "CompleteReadSetUploadPartListItem$partNumber": "<p> A number identifying the part in a read set upload. </p>"
      }
    },
    "CompletionTime": {
      "base": null,
      "refs": {
        "AnnotationImportJobItem$completionTime": "<p>When the job completed.</p>",
        "GetAnnotationImportResponse$completionTime": "<p>When the job completed.</p>",
        "GetVariantImportResponse$completionTime": "<p>When the job completed.</p>",
        "VariantImportJobItem$completionTime": "<p>When the job completed.</p>"
      }
    },
    "ConflictException": {
      "base": "<p>The request cannot be applied to the target resource in its current state.</p>",
      "refs": {
      }
    },
    "ConnectionArn": {
      "base": null,
      "refs": {
        "DefinitionRepository$connectionArn": "<p>The Amazon Resource Name (ARN) of the connection to the source code repository.</p>",
        "DefinitionRepositoryDetails$connectionArn": "<p>The Amazon Resource Name (ARN) of the connection to the source code repository.</p>"
      }
    },
    "ContainerRegistryMap": {
      "base": "<p>Use a container registry map to specify mappings between the ECR private repository and one or more upstream registries. For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/workflows-ecr.html\">Container images</a> in the <i>Amazon Web Services HealthOmics User Guide</i>. </p>",
      "refs": {
        "CreateWorkflowRequest$containerRegistryMap": "<p>(Optional) Use a container registry map to specify mappings between the ECR private repository and one or more upstream registries. For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/workflows-ecr.html\">Container images</a> in the <i>Amazon Web Services HealthOmics User Guide</i>. </p>",
        "CreateWorkflowVersionRequest$containerRegistryMap": "<p>(Optional) Use a container registry map to specify mappings between the ECR private repository and one or more upstream registries. For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/workflows-ecr.html\">Container images</a> in the <i>Amazon Web Services HealthOmics User Guide</i>. </p>",
        "GetWorkflowResponse$containerRegistryMap": "<p>The registry map that this workflow is using.</p>",
        "GetWorkflowVersionResponse$containerRegistryMap": "<p>The registry map that this workflow version uses.</p>"
      }
    },
    "CreateAnnotationStoreRequest": {
      "base": null,
      "refs": {
      }
    },
    "CreateAnnotationStoreResponse": {
      "base": null,
      "refs": {
      }
    },
    "CreateAnnotationStoreVersionRequest": {
      "base": null,
      "refs": {
      }
    },
    "CreateAnnotationStoreVersionResponse": {
      "base": null,
      "refs": {
      }
    },
    "CreateMultipartReadSetUploadRequest": {
      "base": null,
      "refs": {
      }
    },
    "CreateMultipartReadSetUploadResponse": {
      "base": null,
      "refs": {
      }
    },
    "CreateReferenceStoreRequest": {
      "base": null,
      "refs": {
      }
    },
    "CreateReferenceStoreResponse": {
      "base": null,
      "refs": {
      }
    },
    "CreateRunCacheRequest": {
      "base": null,
      "refs": {
      }
    },
    "CreateRunCacheResponse": {
      "base": null,
      "refs": {
      }
    },
    "CreateRunGroupRequest": {
      "base": null,
      "refs": {
      }
    },
    "CreateRunGroupRequestMaxCpusInteger": {
      "base": null,
      "refs": {
        "CreateRunGroupRequest$maxCpus": "<p>The maximum number of CPUs that can run concurrently across all active runs in the run group.</p>"
      }
    },
    "CreateRunGroupRequestMaxDurationInteger": {
      "base": null,
      "refs": {
        "CreateRunGroupRequest$maxDuration": "<p>The maximum time for each run (in minutes). If a run exceeds the maximum run time, the run fails automatically.</p>"
      }
    },
    "CreateRunGroupRequestMaxGpusInteger": {
      "base": null,
      "refs": {
        "CreateRunGroupRequest$maxGpus": "<p>The maximum number of GPUs that can run concurrently across all active runs in the run group.</p>"
      }
    },
    "CreateRunGroupRequestMaxRunsInteger": {
      "base": null,
      "refs": {
        "CreateRunGroupRequest$maxRuns": "<p>The maximum number of runs that can be running at the same time.</p>"
      }
    },
    "CreateRunGroupResponse": {
      "base": null,
      "refs": {
      }
    },
    "CreateSequenceStoreRequest": {
      "base": null,
      "refs": {
      }
    },
    "CreateSequenceStoreResponse": {
      "base": null,
      "refs": {
      }
    },
    "CreateShareRequest": {
      "base": null,
      "refs": {
      }
    },
    "CreateShareResponse": {
      "base": null,
      "refs": {
      }
    },
    "CreateVariantStoreRequest": {
      "base": null,
      "refs": {
      }
    },
    "CreateVariantStoreResponse": {
      "base": null,
      "refs": {
      }
    },
    "CreateWorkflowRequest": {
      "base": null,
      "refs": {
      }
    },
    "CreateWorkflowRequestStorageCapacityInteger": {
      "base": null,
      "refs": {
        "CreateWorkflowRequest$storageCapacity": "<p>The default static storage capacity (in gibibytes) for runs that use this workflow or workflow version. The <code>storageCapacity</code> can be overwritten at run time. The storage capacity is not required for runs with a <code>DYNAMIC</code> storage type.</p>"
      }
    },
    "CreateWorkflowResponse": {
      "base": null,
      "refs": {
      }
    },
    "CreateWorkflowVersionRequest": {
      "base": null,
      "refs": {
      }
    },
    "CreateWorkflowVersionRequestStorageCapacityInteger": {
      "base": null,
      "refs": {
        "CreateWorkflowVersionRequest$storageCapacity": "<p>The default static storage capacity (in gibibytes) for runs that use this workflow version. The <code>storageCapacity</code> can be overwritten at run time. The storage capacity is not required for runs with a <code>DYNAMIC</code> storage type.</p>"
      }
    },
    "CreateWorkflowVersionResponse": {
      "base": null,
      "refs": {
      }
    },
    "CreationJobId": {
      "base": null,
      "refs": {
        "GetReadSetMetadataResponse$creationJobId": "<p>The read set's creation job ID.</p>",
        "GetReferenceMetadataResponse$creationJobId": "<p>The reference's creation job ID.</p>"
      }
    },
    "CreationTime": {
      "base": null,
      "refs": {
        "AnnotationImportJobItem$creationTime": "<p>When the job was created.</p>",
        "AnnotationStoreItem$creationTime": "<p>The store's creation time.</p>",
        "AnnotationStoreVersionItem$creationTime": "<p> The time stamp for when an annotation store version was created. </p>",
        "CreateAnnotationStoreResponse$creationTime": "<p>When the store was created.</p>",
        "CreateAnnotationStoreVersionResponse$creationTime": "<p> The time stamp for the creation of an annotation store version. </p>",
        "CreateVariantStoreResponse$creationTime": "<p>When the store was created.</p>",
        "GetAnnotationImportResponse$creationTime": "<p>When the job was created.</p>",
        "GetAnnotationStoreResponse$creationTime": "<p>When the store was created.</p>",
        "GetAnnotationStoreVersionResponse$creationTime": "<p> The time stamp for when an annotation store version was created. </p>",
        "GetVariantImportResponse$creationTime": "<p>When the job was created.</p>",
        "GetVariantStoreResponse$creationTime": "<p>When the store was created.</p>",
        "ShareDetails$creationTime": "<p>The timestamp of when the resource share was created.</p>",
        "UpdateAnnotationStoreResponse$creationTime": "<p>When the store was created.</p>",
        "UpdateAnnotationStoreVersionResponse$creationTime": "<p> The time stamp for when an annotation store version was created. </p>",
        "UpdateVariantStoreResponse$creationTime": "<p>When the store was created.</p>",
        "VariantImportJobItem$creationTime": "<p>When the job was created.</p>",
        "VariantStoreItem$creationTime": "<p>When the store was created.</p>"
      }
    },
    "CreationType": {
      "base": null,
      "refs": {
        "GetReadSetMetadataResponse$creationType": "<p> The creation type of the read set. </p>",
        "ReadSetFilter$creationType": "<p> The creation type of the read set. </p>",
        "ReadSetListItem$creationType": "<p> The creation type of the read set. </p>"
      }
    },
    "DefinitionRepository": {
      "base": "<p>Contains information about a source code repository that hosts the workflow definition files.</p>",
      "refs": {
        "CreateWorkflowRequest$definitionRepository": "<p>The repository information for the workflow definition. This allows you to source your workflow definition directly from a code repository.</p>",
        "CreateWorkflowVersionRequest$definitionRepository": "<p>The repository information for the workflow version definition. This allows you to source your workflow version definition directly from a code repository.</p>"
      }
    },
    "DefinitionRepositoryDetails": {
      "base": "<p>Contains detailed information about the source code repository that hosts the workflow definition files.</p>",
      "refs": {
        "GetWorkflowResponse$definitionRepositoryDetails": "<p>Details about the source code repository that hosts the workflow definition files.</p>",
        "GetWorkflowVersionResponse$definitionRepositoryDetails": "<p>Details about the source code repository that hosts the workflow version definition files.</p>"
      }
    },
    "DeleteAnnotationStoreRequest": {
      "base": null,
      "refs": {
      }
    },
    "DeleteAnnotationStoreResponse": {
      "base": null,
      "refs": {
      }
    },
    "DeleteAnnotationStoreVersionsRequest": {
      "base": null,
      "refs": {
      }
    },
    "DeleteAnnotationStoreVersionsResponse": {
      "base": null,
      "refs": {
      }
    },
    "DeleteReferenceRequest": {
      "base": null,
      "refs": {
      }
    },
    "DeleteReferenceResponse": {
      "base": null,
      "refs": {
      }
    },
    "DeleteReferenceStoreRequest": {
      "base": null,
      "refs": {
      }
    },
    "DeleteReferenceStoreResponse": {
      "base": null,
      "refs": {
      }
    },
    "DeleteRunCacheRequest": {
      "base": null,
      "refs": {
      }
    },
    "DeleteRunGroupRequest": {
      "base": null,
      "refs": {
      }
    },
    "DeleteRunRequest": {
      "base": null,
      "refs": {
      }
    },
    "DeleteS3AccessPolicyRequest": {
      "base": null,
      "refs": {
      }
    },
    "DeleteS3AccessPolicyResponse": {
      "base": null,
      "refs": {
      }
    },
    "DeleteSequenceStoreRequest": {
      "base": null,
      "refs": {
      }
    },
    "DeleteSequenceStoreResponse": {
      "base": null,
      "refs": {
      }
    },
    "DeleteShareRequest": {
      "base": null,
      "refs": {
      }
    },
    "DeleteShareResponse": {
      "base": null,
      "refs": {
      }
    },
    "DeleteVariantStoreRequest": {
      "base": null,
      "refs": {
      }
    },
    "DeleteVariantStoreResponse": {
      "base": null,
      "refs": {
      }
    },
    "DeleteWorkflowRequest": {
      "base": null,
      "refs": {
      }
    },
    "DeleteWorkflowVersionRequest": {
      "base": null,
      "refs": {
      }
    },
    "Description": {
      "base": null,
      "refs": {
        "AnnotationStoreItem$description": "<p>The store's description.</p>",
        "AnnotationStoreVersionItem$description": "<p> The description of an annotation store version. </p>",
        "CreateAnnotationStoreRequest$description": "<p>A description for the store.</p>",
        "CreateAnnotationStoreVersionRequest$description": "<p> The description of an annotation store version. </p>",
        "CreateVariantStoreRequest$description": "<p>A description for the store.</p>",
        "GetAnnotationStoreResponse$description": "<p>The store's description.</p>",
        "GetAnnotationStoreVersionResponse$description": "<p> The description for an annotation store version. </p>",
        "GetVariantStoreResponse$description": "<p>The store's description.</p>",
        "UpdateAnnotationStoreRequest$description": "<p>A description for the store.</p>",
        "UpdateAnnotationStoreResponse$description": "<p>The store's description.</p>",
        "UpdateAnnotationStoreVersionRequest$description": "<p> The description of an annotation store. </p>",
        "UpdateAnnotationStoreVersionResponse$description": "<p> The description of an annotation store version. </p>",
        "UpdateVariantStoreRequest$description": "<p>A description for the store.</p>",
        "UpdateVariantStoreResponse$description": "<p>The store's description.</p>",
        "VariantStoreItem$description": "<p>The store's description.</p>"
      }
    },
    "ETag": {
      "base": "<p>The entity tag (ETag) is a hash of the object representing its semantic content.</p>",
      "refs": {
        "GetReadSetMetadataResponse$etag": "<p>The entity tag (ETag) is a hash of the object representing its semantic content.</p>",
        "ReadSetListItem$etag": "<p>The entity tag (ETag) is a hash of the object representing its semantic content.</p>"
      }
    },
    "ETagAlgorithm": {
      "base": null,
      "refs": {
        "ETag$algorithm": "<p>The algorithm used to calculate the read set’s ETag(s).</p>"
      }
    },
    "ETagAlgorithmFamily": {
      "base": null,
      "refs": {
        "CreateSequenceStoreRequest$eTagAlgorithmFamily": "<p>The ETag algorithm family to use for ingested read sets. The default value is MD5up. For more information on ETags, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/etags-and-provenance.html\">ETags and data provenance</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
        "CreateSequenceStoreResponse$eTagAlgorithmFamily": "<p>The algorithm family of the ETag.</p>",
        "GetSequenceStoreResponse$eTagAlgorithmFamily": "<p>The algorithm family of the ETag.</p>",
        "SequenceStoreDetail$eTagAlgorithmFamily": "<p>The algorithm family of the ETag.</p>",
        "UpdateSequenceStoreResponse$eTagAlgorithmFamily": "<p>The ETag algorithm family to use on ingested read sets.</p>"
      }
    },
    "EcrRepositoryPrefix": {
      "base": null,
      "refs": {
        "RegistryMapping$ecrRepositoryPrefix": "<p>The repository prefix to use in the ECR private repository.</p>"
      }
    },
    "Encoding": {
      "base": null,
      "refs": {
        "ReadOptions$encoding": "<p>The file's encoding.</p>"
      }
    },
    "EncryptionType": {
      "base": null,
      "refs": {
        "SseConfig$type": "<p>The encryption type.</p>"
      }
    },
    "EngineLogStream": {
      "base": null,
      "refs": {
        "RunLogLocation$engineLogStream": "<p>The log stream ARN for the engine log.</p>"
      }
    },
    "EngineVersion": {
      "base": null,
      "refs": {
        "GetRunResponse$engineVersion": "<p>The actual Nextflow engine version that Amazon Web Services HealthOmics used for the run. The other workflow definition languages don't provide a value for this field.</p>"
      }
    },
    "EscapeChar": {
      "base": null,
      "refs": {
        "ReadOptions$escape": "<p>A character for escaping quotes in the file.</p>"
      }
    },
    "EscapeQuotes": {
      "base": null,
      "refs": {
        "ReadOptions$escapeQuotes": "<p>Whether quotes need to be escaped in the file.</p>"
      }
    },
    "ExcludeFilePatternList": {
      "base": null,
      "refs": {
        "DefinitionRepository$excludeFilePatterns": "<p>A list of file patterns to exclude when retrieving the workflow definition from the repository.</p>"
      }
    },
    "ExportJobId": {
      "base": null,
      "refs": {
        "ExportReadSetJobDetail$id": "<p>The job's ID.</p>",
        "GetReadSetExportJobRequest$id": "<p>The job's ID.</p>",
        "GetReadSetExportJobResponse$id": "<p>The job's ID.</p>",
        "StartReadSetExportJobResponse$id": "<p>The job's ID.</p>"
      }
    },
    "ExportReadSet": {
      "base": "<p>A read set.</p>",
      "refs": {
        "StartReadSetExportJobRequestSourcesList$member": null
      }
    },
    "ExportReadSetDetail": {
      "base": "<p>Details about a read set.</p>",
      "refs": {
        "ExportReadSetDetailList$member": null
      }
    },
    "ExportReadSetDetailList": {
      "base": null,
      "refs": {
        "GetReadSetExportJobResponse$readSets": "<p>The job's read sets.</p>"
      }
    },
    "ExportReadSetFilter": {
      "base": "<p>A read set export job filter.</p>",
      "refs": {
        "ListReadSetExportJobsRequest$filter": "<p>A filter to apply to the list.</p>"
      }
    },
    "ExportReadSetJobDetail": {
      "base": "<p>Details about a read set export job.</p>",
      "refs": {
        "ExportReadSetJobDetailList$member": null
      }
    },
    "ExportReadSetJobDetailList": {
      "base": null,
      "refs": {
        "ListReadSetExportJobsResponse$exportJobs": "<p>A list of jobs.</p>"
      }
    },
    "FallbackLocation": {
      "base": null,
      "refs": {
        "CreateSequenceStoreRequest$fallbackLocation": "<p>An S3 location that is used to store files that have failed a direct upload. You can add or change the <code>fallbackLocation</code> after creating a sequence store. This is not required if you are uploading files from a different S3 bucket.</p>",
        "CreateSequenceStoreResponse$fallbackLocation": "<p>An S3 location that is used to store files that have failed a direct upload.</p>",
        "GetSequenceStoreResponse$fallbackLocation": "<p>An S3 location that is used to store files that have failed a direct upload.</p>",
        "SequenceStoreDetail$fallbackLocation": "<p> An S3 location that is used to store files that have failed a direct upload. </p>",
        "UpdateSequenceStoreRequest$fallbackLocation": "<p>The S3 URI of a bucket and folder to store read sets that fail to upload.</p>",
        "UpdateSequenceStoreResponse$fallbackLocation": "<p>The S3 URI of a bucket and folder to store read sets that fail to upload.</p>"
      }
    },
    "FileInformation": {
      "base": "<p>Details about a file.</p>",
      "refs": {
        "ReadSetFiles$source1": "<p>The location of the first file in Amazon S3.</p>",
        "ReadSetFiles$source2": "<p>The location of the second file in Amazon S3.</p>",
        "ReadSetFiles$index": "<p>The files' index.</p>",
        "ReferenceFiles$source": "<p>The source file's location in Amazon S3.</p>",
        "ReferenceFiles$index": "<p>The files' index.</p>"
      }
    },
    "FileInformationContentLengthLong": {
      "base": null,
      "refs": {
        "FileInformation$contentLength": "<p>The file's content length.</p>"
      }
    },
    "FileInformationPartSizeLong": {
      "base": null,
      "refs": {
        "FileInformation$partSize": "<p>The file's part size.</p>"
      }
    },
    "FileInformationTotalPartsInteger": {
      "base": null,
      "refs": {
        "FileInformation$totalParts": "<p>The file's total parts.</p>"
      }
    },
    "FileType": {
      "base": null,
      "refs": {
        "CreateMultipartReadSetUploadRequest$sourceFileType": "<p>The type of file being uploaded.</p>",
        "CreateMultipartReadSetUploadResponse$sourceFileType": "<p>The file type of the read set source.</p>",
        "GetReadSetMetadataResponse$fileType": "<p>The read set's file type.</p>",
        "ImportReadSetSourceItem$sourceFileType": "<p>The source's file type.</p>",
        "MultipartReadSetUploadListItem$sourceFileType": "<p> The type of file the read set originated from. </p>",
        "ReadSetListItem$fileType": "<p>The read set's file type.</p>",
        "StartReadSetImportJobSourceItem$sourceFileType": "<p>The source's file type.</p>"
      }
    },
    "Filter": {
      "base": "<p>Use filters to return a subset of resources. You can define filters for specific parameters, such as the resource status.</p>",
      "refs": {
        "ListSharesRequest$filter": "<p>Attributes that you use to filter for a specific subset of resource shares.</p>"
      }
    },
    "FormatOptions": {
      "base": "<p>Formatting options for a file.</p>",
      "refs": {
        "GetAnnotationImportResponse$formatOptions": null,
        "StartAnnotationImportRequest$formatOptions": "<p>Formatting options for the annotation file.</p>"
      }
    },
    "FormatToHeader": {
      "base": null,
      "refs": {
        "TsvStoreOptions$formatToHeader": "<p>The store's header key to column name mapping.</p>",
        "TsvVersionOptions$formatToHeader": "<p> The annotation store version's header key to column name mapping. </p>"
      }
    },
    "FormatToHeaderKey": {
      "base": null,
      "refs": {
        "FormatToHeader$key": null
      }
    },
    "FormatToHeaderValueString": {
      "base": null,
      "refs": {
        "FormatToHeader$value": null
      }
    },
    "FullRepositoryId": {
      "base": null,
      "refs": {
        "DefinitionRepository$fullRepositoryId": "<p>The full repository identifier, including the repository owner and name. For example, 'repository-owner/repository-name'.</p>",
        "DefinitionRepositoryDetails$fullRepositoryId": "<p>The full repository identifier, including the repository owner and name. For example, 'repository-owner/repository-name'.</p>"
      }
    },
    "GeneratedFrom": {
      "base": null,
      "refs": {
        "CreateMultipartReadSetUploadRequest$generatedFrom": "<p>Where the source originated.</p>",
        "CreateMultipartReadSetUploadResponse$generatedFrom": "<p>The source of the read set.</p>",
        "ImportReadSetSourceItem$generatedFrom": "<p>Where the source originated.</p>",
        "MultipartReadSetUploadListItem$generatedFrom": "<p> The source of an uploaded part. </p>",
        "ReadSetFilter$generatedFrom": "<p> Where the source originated. </p>",
        "SequenceInformation$generatedFrom": "<p>Where the sequence originated.</p>",
        "StartReadSetImportJobSourceItem$generatedFrom": "<p>Where the source originated.</p>"
      }
    },
    "GetAnnotationImportRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetAnnotationImportResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetAnnotationStoreRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetAnnotationStoreResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetAnnotationStoreVersionRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetAnnotationStoreVersionResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetReadSetActivationJobRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetReadSetActivationJobResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetReadSetExportJobRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetReadSetExportJobResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetReadSetImportJobRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetReadSetImportJobResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetReadSetMetadataRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetReadSetMetadataResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetReadSetRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetReadSetRequestPartNumberInteger": {
      "base": null,
      "refs": {
        "GetReadSetRequest$partNumber": "<p>The part number to retrieve.</p>"
      }
    },
    "GetReadSetResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetReferenceImportJobRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetReferenceImportJobResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetReferenceMetadataRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetReferenceMetadataResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetReferenceRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetReferenceRequestPartNumberInteger": {
      "base": null,
      "refs": {
        "GetReferenceRequest$partNumber": "<p>The part number to retrieve.</p>"
      }
    },
    "GetReferenceResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetReferenceStoreRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetReferenceStoreResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetRunCacheRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetRunCacheResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetRunGroupRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetRunGroupResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetRunGroupResponseMaxCpusInteger": {
      "base": null,
      "refs": {
        "GetRunGroupResponse$maxCpus": "<p>The group's maximum number of CPUs to use.</p>"
      }
    },
    "GetRunGroupResponseMaxDurationInteger": {
      "base": null,
      "refs": {
        "GetRunGroupResponse$maxDuration": "<p>The group's maximum run time in minutes.</p>"
      }
    },
    "GetRunGroupResponseMaxGpusInteger": {
      "base": null,
      "refs": {
        "GetRunGroupResponse$maxGpus": "<p>The maximum GPUs that can be used by a run group.</p>"
      }
    },
    "GetRunGroupResponseMaxRunsInteger": {
      "base": null,
      "refs": {
        "GetRunGroupResponse$maxRuns": "<p>The maximum number of concurrent runs for the group.</p>"
      }
    },
    "GetRunRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetRunResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetRunResponsePriorityInteger": {
      "base": null,
      "refs": {
        "GetRunResponse$priority": "<p>The run's priority.</p>"
      }
    },
    "GetRunResponseStorageCapacityInteger": {
      "base": null,
      "refs": {
        "GetRunResponse$storageCapacity": "<p>The run's storage capacity in gibibytes. For dynamic storage, after the run has completed, this value is the maximum amount of storage used during the run.</p>"
      }
    },
    "GetRunTaskRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetRunTaskResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetRunTaskResponseCpusInteger": {
      "base": null,
      "refs": {
        "GetRunTaskResponse$cpus": "<p>The task's CPU usage.</p>"
      }
    },
    "GetRunTaskResponseGpusInteger": {
      "base": null,
      "refs": {
        "GetRunTaskResponse$gpus": "<p>The number of Graphics Processing Units (GPU) specified in the task.</p>"
      }
    },
    "GetRunTaskResponseMemoryInteger": {
      "base": null,
      "refs": {
        "GetRunTaskResponse$memory": "<p>The task's memory use in gigabytes.</p>"
      }
    },
    "GetS3AccessPolicyRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetS3AccessPolicyResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetSequenceStoreRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetSequenceStoreResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetShareRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetShareResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetVariantImportRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetVariantImportResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetVariantStoreRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetVariantStoreResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetWorkflowRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetWorkflowResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetWorkflowResponseStorageCapacityInteger": {
      "base": null,
      "refs": {
        "GetWorkflowResponse$storageCapacity": "<p>The default static storage capacity (in gibibytes) for runs that use this workflow or workflow version.</p>"
      }
    },
    "GetWorkflowVersionRequest": {
      "base": null,
      "refs": {
      }
    },
    "GetWorkflowVersionResponse": {
      "base": null,
      "refs": {
      }
    },
    "GetWorkflowVersionResponseStorageCapacityInteger": {
      "base": null,
      "refs": {
        "GetWorkflowVersionResponse$storageCapacity": "<p>The default run storage capacity for static storage.</p>"
      }
    },
    "Header": {
      "base": null,
      "refs": {
        "ReadOptions$header": "<p>Whether the file has a header row.</p>"
      }
    },
    "ImageDetails": {
      "base": "<p>Information about the container image used for a task.</p>",
      "refs": {
        "GetRunTaskResponse$imageDetails": "<p>Details about the container image that this task uses.</p>"
      }
    },
    "ImageMapping": {
      "base": "<p>Specifies image mappings that workflow tasks can use. For example, you can replace all the task references of a public image to use an equivalent image in your private ECR repository. You can use image mappings with upstream registries that don't support pull through cache. You need to manually synchronize the upstream registry with your private repository. </p>",
      "refs": {
        "ImageMappingsList$member": null
      }
    },
    "ImageMappingsList": {
      "base": null,
      "refs": {
        "ContainerRegistryMap$imageMappings": "<p>Image mappings specify path mappings between the ECR private repository and their corresponding external repositories.</p>"
      }
    },
    "ImportJobId": {
      "base": null,
      "refs": {
        "GetReadSetImportJobRequest$id": "<p>The job's ID.</p>",
        "GetReadSetImportJobResponse$id": "<p>The job's ID.</p>",
        "GetReferenceImportJobRequest$id": "<p>The job's ID.</p>",
        "GetReferenceImportJobResponse$id": "<p>The job's ID.</p>",
        "ImportReadSetJobItem$id": "<p>The job's ID.</p>",
        "ImportReferenceJobItem$id": "<p>The job's ID.</p>",
        "StartReadSetImportJobResponse$id": "<p>The job's ID.</p>",
        "StartReferenceImportJobResponse$id": "<p>The job's ID.</p>"
      }
    },
    "ImportReadSetFilter": {
      "base": "<p>A filter for import read set jobs.</p>",
      "refs": {
        "ListReadSetImportJobsRequest$filter": "<p>A filter to apply to the list.</p>"
      }
    },
    "ImportReadSetJobItem": {
      "base": "<p>An import read set job.</p>",
      "refs": {
        "ImportReadSetJobList$member": null
      }
    },
    "ImportReadSetJobList": {
      "base": null,
      "refs": {
        "ListReadSetImportJobsResponse$importJobs": "<p>A list of jobs.</p>"
      }
    },
    "ImportReadSetSourceItem": {
      "base": "<p>A source for an import read set job.</p>",
      "refs": {
        "ImportReadSetSourceList$member": null
      }
    },
    "ImportReadSetSourceList": {
      "base": null,
      "refs": {
        "GetReadSetImportJobResponse$sources": "<p>The job's source files.</p>"
      }
    },
    "ImportReferenceFilter": {
      "base": "<p>A filter for import references.</p>",
      "refs": {
        "ListReferenceImportJobsRequest$filter": "<p>A filter to apply to the list.</p>"
      }
    },
    "ImportReferenceJobItem": {
      "base": "<p>An import reference job.</p>",
      "refs": {
        "ImportReferenceJobList$member": null
      }
    },
    "ImportReferenceJobList": {
      "base": null,
      "refs": {
        "ListReferenceImportJobsResponse$importJobs": "<p>A lis of jobs.</p>"
      }
    },
    "ImportReferenceSourceItem": {
      "base": "<p>An genome reference source.</p>",
      "refs": {
        "ImportReferenceSourceList$member": null
      }
    },
    "ImportReferenceSourceList": {
      "base": null,
      "refs": {
        "GetReferenceImportJobResponse$sources": "<p>The job's source files.</p>"
      }
    },
    "Integer": {
      "base": null,
      "refs": {
        "GetAnnotationStoreResponse$numVersions": "<p> An integer indicating how many versions of an annotation store exist. </p>",
        "ListSharesRequest$maxResults": "<p>The maximum number of shares to return in one page of results.</p>"
      }
    },
    "InternalServerException": {
      "base": "<p>An unexpected error occurred. Try the request again.</p>",
      "refs": {
      }
    },
    "JobStatus": {
      "base": null,
      "refs": {
        "AnnotationImportItemDetail$jobStatus": "<p>The item's job status.</p>",
        "AnnotationImportJobItem$status": "<p>The job's status.</p>",
        "GetAnnotationImportResponse$status": "<p>The job's status.</p>",
        "GetVariantImportResponse$status": "<p>The job's status.</p>",
        "ListAnnotationImportJobsFilter$status": "<p>A status to filter on.</p>",
        "ListVariantImportJobsFilter$status": "<p>A status to filter on.</p>",
        "VariantImportItemDetail$jobStatus": "<p>The item's job status.</p>",
        "VariantImportJobItem$status": "<p>The job's status.</p>"
      }
    },
    "JobStatusMessage": {
      "base": null,
      "refs": {
        "ActivateReadSetSourceItem$statusMessage": "<p>The source's status message.</p>",
        "ExportReadSetDetail$statusMessage": "<p>The set's status message.</p>",
        "GetReadSetActivationJobResponse$statusMessage": "<p>The job's status message.</p>",
        "GetReadSetExportJobResponse$statusMessage": "<p>The job's status message.</p>",
        "GetReadSetImportJobResponse$statusMessage": "<p>The job's status message.</p>",
        "GetReferenceImportJobResponse$statusMessage": "<p>The job's status message.</p>",
        "ImportReadSetSourceItem$statusMessage": "<p>The source's status message.</p>",
        "ImportReferenceSourceItem$statusMessage": "<p>The source's status message.</p>"
      }
    },
    "JobStatusMsg": {
      "base": null,
      "refs": {
        "GetAnnotationImportResponse$statusMessage": "<p>The job's status message.</p>",
        "GetVariantImportResponse$statusMessage": "<p>The job's status message.</p>",
        "VariantImportItemDetail$statusMessage": "<p> A message that provides additional context about a job </p>"
      }
    },
    "LineSep": {
      "base": null,
      "refs": {
        "ReadOptions$lineSep": "<p>A line separator for the file.</p>"
      }
    },
    "ListAnnotationImportJobsFilter": {
      "base": "<p>A filter for annotation import jobs.</p>",
      "refs": {
        "ListAnnotationImportJobsRequest$filter": "<p>A filter to apply to the list.</p>"
      }
    },
    "ListAnnotationImportJobsRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListAnnotationImportJobsRequestIdsList": {
      "base": null,
      "refs": {
        "ListAnnotationImportJobsRequest$ids": "<p>IDs of annotation import jobs to retrieve.</p>"
      }
    },
    "ListAnnotationImportJobsRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListAnnotationImportJobsRequest$maxResults": "<p>The maximum number of jobs to return in one page of results.</p>"
      }
    },
    "ListAnnotationImportJobsRequestNextTokenString": {
      "base": null,
      "refs": {
        "ListAnnotationImportJobsRequest$nextToken": "<p>Specifies the pagination token from a previous request to retrieve the next page of results.</p>"
      }
    },
    "ListAnnotationImportJobsResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListAnnotationStoreVersionsFilter": {
      "base": "<p>Use filters to focus the returned annotation store versions on a specific parameter, such as the status of the annotation store.</p>",
      "refs": {
        "ListAnnotationStoreVersionsRequest$filter": "<p> A filter to apply to the list of annotation store versions. </p>"
      }
    },
    "ListAnnotationStoreVersionsRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListAnnotationStoreVersionsRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListAnnotationStoreVersionsRequest$maxResults": "<p> The maximum number of annotation store versions to return in one page of results. </p>"
      }
    },
    "ListAnnotationStoreVersionsRequestNextTokenString": {
      "base": null,
      "refs": {
        "ListAnnotationStoreVersionsRequest$nextToken": "<p> Specifies the pagination token from a previous request to retrieve the next page of results. </p>"
      }
    },
    "ListAnnotationStoreVersionsResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListAnnotationStoresFilter": {
      "base": "<p>A filter for annotation stores.</p>",
      "refs": {
        "ListAnnotationStoresRequest$filter": "<p>A filter to apply to the list.</p>"
      }
    },
    "ListAnnotationStoresRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListAnnotationStoresRequestIdsList": {
      "base": null,
      "refs": {
        "ListAnnotationStoresRequest$ids": "<p>IDs of stores to list.</p>"
      }
    },
    "ListAnnotationStoresRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListAnnotationStoresRequest$maxResults": "<p>The maximum number of stores to return in one page of results.</p>"
      }
    },
    "ListAnnotationStoresRequestNextTokenString": {
      "base": null,
      "refs": {
        "ListAnnotationStoresRequest$nextToken": "<p>Specify the pagination token from a previous request to retrieve the next page of results.</p>"
      }
    },
    "ListAnnotationStoresResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListMultipartReadSetUploadsRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListMultipartReadSetUploadsRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListMultipartReadSetUploadsRequest$maxResults": "<p>The maximum number of multipart uploads returned in a page.</p>"
      }
    },
    "ListMultipartReadSetUploadsResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListReadSetActivationJobsRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListReadSetActivationJobsRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListReadSetActivationJobsRequest$maxResults": "<p>The maximum number of read set activation jobs to return in one page of results.</p>"
      }
    },
    "ListReadSetActivationJobsResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListReadSetExportJobsRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListReadSetExportJobsRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListReadSetExportJobsRequest$maxResults": "<p>The maximum number of jobs to return in one page of results.</p>"
      }
    },
    "ListReadSetExportJobsResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListReadSetImportJobsRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListReadSetImportJobsRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListReadSetImportJobsRequest$maxResults": "<p>The maximum number of jobs to return in one page of results.</p>"
      }
    },
    "ListReadSetImportJobsResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListReadSetUploadPartsRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListReadSetUploadPartsRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListReadSetUploadPartsRequest$maxResults": "<p>The maximum number of read set upload parts returned in a page.</p>"
      }
    },
    "ListReadSetUploadPartsResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListReadSetsRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListReadSetsRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListReadSetsRequest$maxResults": "<p>The maximum number of read sets to return in one page of results.</p>"
      }
    },
    "ListReadSetsResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListReferenceImportJobsRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListReferenceImportJobsRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListReferenceImportJobsRequest$maxResults": "<p>The maximum number of jobs to return in one page of results.</p>"
      }
    },
    "ListReferenceImportJobsResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListReferenceStoresRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListReferenceStoresRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListReferenceStoresRequest$maxResults": "<p>The maximum number of stores to return in one page of results.</p>"
      }
    },
    "ListReferenceStoresResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListReferencesRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListReferencesRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListReferencesRequest$maxResults": "<p>The maximum number of references to return in one page of results.</p>"
      }
    },
    "ListReferencesResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListRunCachesRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListRunCachesRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListRunCachesRequest$maxResults": "<p>The maximum number of results to return.</p>"
      }
    },
    "ListRunCachesResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListRunGroupsRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListRunGroupsRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListRunGroupsRequest$maxResults": "<p>The maximum number of run groups to return in one page of results.</p>"
      }
    },
    "ListRunGroupsResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListRunTasksRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListRunTasksRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListRunTasksRequest$maxResults": "<p>The maximum number of run tasks to return in one page of results.</p>"
      }
    },
    "ListRunTasksResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListRunsRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListRunsRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListRunsRequest$maxResults": "<p>The maximum number of runs to return in one page of results.</p>"
      }
    },
    "ListRunsResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListSequenceStoresRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListSequenceStoresRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListSequenceStoresRequest$maxResults": "<p>The maximum number of stores to return in one page of results.</p>"
      }
    },
    "ListSequenceStoresResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListSharesRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListSharesResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListTagsForResourceRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListTagsForResourceResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListToken": {
      "base": null,
      "refs": {
        "ListRunCachesRequest$startingToken": "<p>Optional pagination token returned from a prior call to the <code>ListRunCaches</code> API operation.</p>",
        "ListRunCachesResponse$nextToken": "<p>Pagination token to retrieve additional run caches. If the response does not have a <code>nextToken</code>value, you have reached to the end of the list.</p>"
      }
    },
    "ListVariantImportJobsFilter": {
      "base": "<p>A filter for variant import jobs.</p>",
      "refs": {
        "ListVariantImportJobsRequest$filter": "<p>A filter to apply to the list.</p>"
      }
    },
    "ListVariantImportJobsRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListVariantImportJobsRequestIdsList": {
      "base": null,
      "refs": {
        "ListVariantImportJobsRequest$ids": "<p>A list of job IDs.</p>"
      }
    },
    "ListVariantImportJobsRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListVariantImportJobsRequest$maxResults": "<p>The maximum number of import jobs to return in one page of results.</p>"
      }
    },
    "ListVariantImportJobsRequestNextTokenString": {
      "base": null,
      "refs": {
        "ListVariantImportJobsRequest$nextToken": "<p>Specify the pagination token from a previous request to retrieve the next page of results.</p>"
      }
    },
    "ListVariantImportJobsResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListVariantStoresFilter": {
      "base": "<p>A filter for variant stores.</p>",
      "refs": {
        "ListVariantStoresRequest$filter": "<p>A filter to apply to the list.</p>"
      }
    },
    "ListVariantStoresRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListVariantStoresRequestIdsList": {
      "base": null,
      "refs": {
        "ListVariantStoresRequest$ids": "<p>A list of store IDs.</p>"
      }
    },
    "ListVariantStoresRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListVariantStoresRequest$maxResults": "<p>The maximum number of stores to return in one page of results.</p>"
      }
    },
    "ListVariantStoresRequestNextTokenString": {
      "base": null,
      "refs": {
        "ListVariantStoresRequest$nextToken": "<p>Specify the pagination token from a previous request to retrieve the next page of results.</p>"
      }
    },
    "ListVariantStoresResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListWorkflowVersionsRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListWorkflowVersionsRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListWorkflowVersionsRequest$maxResults": "<p>The maximum number of workflows to return in one page of results.</p>"
      }
    },
    "ListWorkflowVersionsResponse": {
      "base": null,
      "refs": {
      }
    },
    "ListWorkflowsRequest": {
      "base": null,
      "refs": {
      }
    },
    "ListWorkflowsRequestMaxResultsInteger": {
      "base": null,
      "refs": {
        "ListWorkflowsRequest$maxResults": "<p>The maximum number of workflows to return in one page of results.</p>"
      }
    },
    "ListWorkflowsResponse": {
      "base": null,
      "refs": {
      }
    },
    "Long": {
      "base": null,
      "refs": {
        "AnnotationStoreItem$storeSizeBytes": "<p>The store's size in bytes.</p>",
        "AnnotationStoreVersionItem$versionSizeBytes": "<p> The size of an annotation store version in Bytes. </p>",
        "GetAnnotationStoreResponse$storeSizeBytes": "<p>The store's size in bytes.</p>",
        "GetAnnotationStoreVersionResponse$versionSizeBytes": "<p> The size of the annotation store version in Bytes. </p>",
        "GetVariantStoreResponse$storeSizeBytes": "<p>The store's size in bytes.</p>",
        "SequenceInformation$totalReadCount": "<p>The sequence's total read count.</p>",
        "SequenceInformation$totalBaseCount": "<p>The sequence's total base count.</p>",
        "VariantStoreItem$storeSizeBytes": "<p>The store's size in bytes.</p>"
      }
    },
    "Md5": {
      "base": null,
      "refs": {
        "GetReferenceMetadataResponse$md5": "<p>The reference's MD5 checksum.</p>",
        "ReferenceFilter$md5": "<p>An MD5 checksum to filter on.</p>",
        "ReferenceListItem$md5": "<p>The reference's MD5 checksum.</p>"
      }
    },
    "MultipartReadSetUploadList": {
      "base": null,
      "refs": {
        "ListMultipartReadSetUploadsResponse$uploads": "<p>An array of multipart uploads.</p>"
      }
    },
    "MultipartReadSetUploadListItem": {
      "base": "<p> Part of the response to ListMultipartReadSetUploads, excluding completed and aborted multipart uploads. </p>",
      "refs": {
        "MultipartReadSetUploadList$member": null
      }
    },
    "NextToken": {
      "base": null,
      "refs": {
        "ListMultipartReadSetUploadsRequest$nextToken": "<p>Next token returned in the response of a previous ListMultipartReadSetUploads call. Used to get the next page of results.</p>",
        "ListMultipartReadSetUploadsResponse$nextToken": "<p>Next token returned in the response of a previous ListMultipartReadSetUploads call. Used to get the next page of results.</p>",
        "ListReadSetActivationJobsRequest$nextToken": "<p>Specify the pagination token from a previous request to retrieve the next page of results.</p>",
        "ListReadSetActivationJobsResponse$nextToken": "<p>A pagination token that's included if more results are available.</p>",
        "ListReadSetExportJobsRequest$nextToken": "<p>Specify the pagination token from a previous request to retrieve the next page of results.</p>",
        "ListReadSetExportJobsResponse$nextToken": "<p>A pagination token that's included if more results are available.</p>",
        "ListReadSetImportJobsRequest$nextToken": "<p>Specify the pagination token from a previous request to retrieve the next page of results.</p>",
        "ListReadSetImportJobsResponse$nextToken": "<p>A pagination token that's included if more results are available.</p>",
        "ListReadSetUploadPartsRequest$nextToken": "<p>Next token returned in the response of a previous ListReadSetUploadPartsRequest call. Used to get the next page of results.</p>",
        "ListReadSetUploadPartsResponse$nextToken": "<p>Next token returned in the response of a previous ListReadSetUploadParts call. Used to get the next page of results.</p>",
        "ListReadSetsRequest$nextToken": "<p>Specify the pagination token from a previous request to retrieve the next page of results.</p>",
        "ListReadSetsResponse$nextToken": "<p>A pagination token that's included if more results are available.</p>",
        "ListReferenceImportJobsRequest$nextToken": "<p>Specify the pagination token from a previous request to retrieve the next page of results.</p>",
        "ListReferenceImportJobsResponse$nextToken": "<p>A pagination token that's included if more results are available.</p>",
        "ListReferenceStoresRequest$nextToken": "<p>Specify the pagination token from a previous request to retrieve the next page of results.</p>",
        "ListReferenceStoresResponse$nextToken": "<p>A pagination token that's included if more results are available.</p>",
        "ListReferencesRequest$nextToken": "<p>Specify the pagination token from a previous request to retrieve the next page of results.</p>",
        "ListReferencesResponse$nextToken": "<p>A pagination token that's included if more results are available.</p>",
        "ListSequenceStoresRequest$nextToken": "<p>Specify the pagination token from a previous request to retrieve the next page of results.</p>",
        "ListSequenceStoresResponse$nextToken": "<p>A pagination token that's included if more results are available.</p>"
      }
    },
    "NotSupportedOperationException": {
      "base": "<p> The operation is not supported by Amazon Omics, or the API does not exist. </p>",
      "refs": {
      }
    },
    "NumericIdInArn": {
      "base": null,
      "refs": {
        "GetRunResponse$cacheId": "<p>The run cache associated with the run.</p>",
        "StartRunRequest$cacheId": "<p>Identifier of the cache associated with this run. If you don't specify a cache ID, no task outputs are cached for this run.</p>"
      }
    },
    "ParameterTemplatePath": {
      "base": null,
      "refs": {
        "CreateWorkflowRequest$parameterTemplatePath": "<p>The path to the workflow parameter template JSON file within the repository. This file defines the input parameters for runs that use this workflow. If not specified, the workflow will be created without a parameter template.</p>",
        "CreateWorkflowVersionRequest$parameterTemplatePath": "<p>The path to the workflow version parameter template JSON file within the repository. This file defines the input parameters for runs that use this workflow version. If not specified, the workflow version will be created without a parameter template.</p>"
      }
    },
    "PrimitiveBoolean": {
      "base": null,
      "refs": {
        "DeleteAnnotationStoreRequest$force": "<p>Whether to force deletion.</p>",
        "DeleteAnnotationStoreVersionsRequest$force": "<p> Forces the deletion of an annotation store version when imports are in-progress.. </p>",
        "DeleteVariantStoreRequest$force": "<p>Whether to force deletion.</p>"
      }
    },
    "PropagatedSetLevelTags": {
      "base": null,
      "refs": {
        "CreateSequenceStoreRequest$propagatedSetLevelTags": "<p>The tags keys to propagate to the S3 objects associated with read sets in the sequence store. These tags can be used as input to add metadata to your read sets.</p>",
        "CreateSequenceStoreResponse$propagatedSetLevelTags": "<p>The tags keys to propagate to the S3 objects associated with read sets in the sequence store.</p>",
        "GetSequenceStoreResponse$propagatedSetLevelTags": "<p>The tags keys to propagate to the S3 objects associated with read sets in the sequence store.</p>",
        "UpdateSequenceStoreRequest$propagatedSetLevelTags": "<p>The tags keys to propagate to the S3 objects associated with read sets in the sequence store.</p>",
        "UpdateSequenceStoreResponse$propagatedSetLevelTags": "<p>The tags keys to propagate to the S3 objects associated with read sets in the sequence store.</p>"
      }
    },
    "PutS3AccessPolicyRequest": {
      "base": null,
      "refs": {
      }
    },
    "PutS3AccessPolicyResponse": {
      "base": null,
      "refs": {
      }
    },
    "Quote": {
      "base": null,
      "refs": {
        "ReadOptions$quote": "<p>The file's quote character.</p>"
      }
    },
    "QuoteAll": {
      "base": null,
      "refs": {
        "ReadOptions$quoteAll": "<p>Whether all values need to be quoted, or just those that contain quotes.</p>"
      }
    },
    "Range": {
      "base": null,
      "refs": {
        "GetReferenceRequest$range": "<p>The range to retrieve.</p>"
      }
    },
    "RangeNotSatisfiableException": {
      "base": "<p>The ranges specified in the request are not valid.</p>",
      "refs": {
      }
    },
    "ReadOptions": {
      "base": "<p>Read options for an annotation import job.</p>",
      "refs": {
        "TsvOptions$readOptions": "<p>The file's read options.</p>"
      }
    },
    "ReadSetActivationJobItemStatus": {
      "base": null,
      "refs": {
        "ActivateReadSetSourceItem$status": "<p>The source's status.</p>"
      }
    },
    "ReadSetActivationJobStatus": {
      "base": null,
      "refs": {
        "ActivateReadSetFilter$status": "<p>The filter's status.</p>",
        "ActivateReadSetJobItem$status": "<p>The job's status.</p>",
        "GetReadSetActivationJobResponse$status": "<p>The job's status.</p>",
        "StartReadSetActivationJobResponse$status": "<p>The job's status.</p>"
      }
    },
    "ReadSetArn": {
      "base": null,
      "refs": {
        "GetReadSetMetadataResponse$arn": "<p>The read set's ARN.</p>",
        "ReadSetListItem$arn": "<p>The read set's ARN.</p>"
      }
    },
    "ReadSetBatchError": {
      "base": "<p>An error from a batch read set operation.</p>",
      "refs": {
        "ReadSetBatchErrorList$member": null
      }
    },
    "ReadSetBatchErrorList": {
      "base": null,
      "refs": {
        "BatchDeleteReadSetResponse$errors": "<p>Errors returned by individual delete operations.</p>"
      }
    },
    "ReadSetDescription": {
      "base": null,
      "refs": {
        "CreateMultipartReadSetUploadRequest$description": "<p>The description of the read set.</p>",
        "CreateMultipartReadSetUploadResponse$description": "<p>The description of the read set.</p>",
        "GetReadSetMetadataResponse$description": "<p>The read set's description.</p>",
        "ImportReadSetSourceItem$description": "<p>The source's description.</p>",
        "MultipartReadSetUploadListItem$description": "<p> The description of a read set. </p>",
        "ReadSetListItem$description": "<p>The read set's description.</p>",
        "StartReadSetImportJobSourceItem$description": "<p>The source's description.</p>"
      }
    },
    "ReadSetExportJobItemStatus": {
      "base": null,
      "refs": {
        "ExportReadSetDetail$status": "<p>The set's status.</p>"
      }
    },
    "ReadSetExportJobStatus": {
      "base": null,
      "refs": {
        "ExportReadSetFilter$status": "<p>A status to filter on.</p>",
        "ExportReadSetJobDetail$status": "<p>The job's status.</p>",
        "GetReadSetExportJobResponse$status": "<p>The job's status.</p>",
        "StartReadSetExportJobResponse$status": "<p>The job's status.</p>"
      }
    },
    "ReadSetFile": {
      "base": null,
      "refs": {
        "GetReadSetRequest$file": "<p>The file to retrieve.</p>"
      }
    },
    "ReadSetFiles": {
      "base": "<p>Files in a read set.</p>",
      "refs": {
        "GetReadSetMetadataResponse$files": "<p>The read set's files.</p>"
      }
    },
    "ReadSetFilter": {
      "base": "<p>A filter for read sets.</p>",
      "refs": {
        "ListReadSetsRequest$filter": "<p>A filter to apply to the list.</p>"
      }
    },
    "ReadSetId": {
      "base": null,
      "refs": {
        "ActivateReadSetSourceItem$readSetId": "<p>The source's read set ID.</p>",
        "CompleteMultipartReadSetUploadResponse$readSetId": "<p>The read set ID created for an uploaded read set.</p>",
        "ExportReadSet$readSetId": "<p>The set's ID.</p>",
        "ExportReadSetDetail$id": "<p>The set's ID.</p>",
        "GetReadSetMetadataRequest$id": "<p>The read set's ID.</p>",
        "GetReadSetMetadataResponse$id": "<p>The read set's ID.</p>",
        "GetReadSetRequest$id": "<p>The read set's ID.</p>",
        "ImportReadSetSourceItem$readSetId": "<p>The source's read set ID.</p>",
        "ReadSetBatchError$id": "<p>The error's ID.</p>",
        "ReadSetIdList$member": null,
        "ReadSetListItem$id": "<p>The read set's ID.</p>",
        "StartReadSetActivationJobSourceItem$readSetId": "<p>The source's read set ID.</p>"
      }
    },
    "ReadSetIdList": {
      "base": null,
      "refs": {
        "BatchDeleteReadSetRequest$ids": "<p>The read sets' IDs.</p>"
      }
    },
    "ReadSetImportJobItemStatus": {
      "base": null,
      "refs": {
        "ImportReadSetSourceItem$status": "<p>The source's status.</p>"
      }
    },
    "ReadSetImportJobStatus": {
      "base": null,
      "refs": {
        "GetReadSetImportJobResponse$status": "<p>The job's status.</p>",
        "ImportReadSetFilter$status": "<p>A status to filter on.</p>",
        "ImportReadSetJobItem$status": "<p>The job's status.</p>",
        "StartReadSetImportJobResponse$status": "<p>The job's status.</p>"
      }
    },
    "ReadSetList": {
      "base": null,
      "refs": {
        "ListReadSetsResponse$readSets": "<p>A list of read sets.</p>"
      }
    },
    "ReadSetListItem": {
      "base": "<p>A read set.</p>",
      "refs": {
        "ReadSetList$member": null
      }
    },
    "ReadSetName": {
      "base": null,
      "refs": {
        "CreateMultipartReadSetUploadRequest$name": "<p>The name of the read set.</p>",
        "CreateMultipartReadSetUploadResponse$name": "<p>The name of the read set.</p>",
        "GetReadSetMetadataResponse$name": "<p>The read set's name.</p>",
        "ImportReadSetSourceItem$name": "<p>The source's name.</p>",
        "MultipartReadSetUploadListItem$name": "<p> The name of a read set. </p>",
        "ReadSetFilter$name": "<p>A name to filter on.</p>",
        "ReadSetListItem$name": "<p>The read set's name.</p>",
        "StartReadSetImportJobSourceItem$name": "<p>The source's name.</p>"
      }
    },
    "ReadSetPartSource": {
      "base": null,
      "refs": {
        "CompleteReadSetUploadPartListItem$partSource": "<p> The source file of the part being uploaded. </p>",
        "ListReadSetUploadPartsRequest$partSource": "<p>The source file for the upload part.</p>",
        "ReadSetUploadPartListItem$partSource": "<p> The origin of the part being direct uploaded. </p>",
        "UploadReadSetPartRequest$partSource": "<p>The source file for an upload part.</p>"
      }
    },
    "ReadSetPartStreamingBlob": {
      "base": null,
      "refs": {
        "UploadReadSetPartRequest$payload": "<p>The read set data to upload for a part.</p>"
      }
    },
    "ReadSetS3Access": {
      "base": "<p>The S3 URI for each read set file.</p>",
      "refs": {
        "FileInformation$s3Access": "<p>The S3 URI metadata of a sequence store.</p>"
      }
    },
    "ReadSetStatus": {
      "base": null,
      "refs": {
        "GetReadSetMetadataResponse$status": "<p>The read set's status.</p>",
        "ReadSetFilter$status": "<p>A status to filter on.</p>",
        "ReadSetListItem$status": "<p>The read set's status.</p>"
      }
    },
    "ReadSetStatusMessage": {
      "base": null,
      "refs": {
        "GetReadSetMetadataResponse$statusMessage": "<p>The status message for a read set. It provides more detail as to why the read set has a status. </p>",
        "ReadSetListItem$statusMessage": "<p> The status for a read set. It provides more detail as to why the read set has a status. </p>"
      }
    },
    "ReadSetStreamingBlob": {
      "base": null,
      "refs": {
        "GetReadSetResponse$payload": "<p>The read set file payload.</p>"
      }
    },
    "ReadSetUploadPartList": {
      "base": null,
      "refs": {
        "ListReadSetUploadPartsResponse$parts": "<p>An array of upload parts.</p>"
      }
    },
    "ReadSetUploadPartListFilter": {
      "base": "<p> Filter settings that select for read set upload parts of interest. </p>",
      "refs": {
        "ListReadSetUploadPartsRequest$filter": "<p>Attributes used to filter for a specific subset of read set part uploads.</p>"
      }
    },
    "ReadSetUploadPartListItem": {
      "base": "<p> The metadata of a single part of a file that was added to a multipart upload. A list of these parts is returned in the response to the ListReadSetUploadParts API. </p>",
      "refs": {
        "ReadSetUploadPartList$member": null
      }
    },
    "ReadSetUploadPartListItemPartNumberInteger": {
      "base": null,
      "refs": {
        "ReadSetUploadPartListItem$partNumber": "<p> The number identifying the part in an upload. </p>"
      }
    },
    "ReadSetUploadPartListItemPartSizeLong": {
      "base": null,
      "refs": {
        "ReadSetUploadPartListItem$partSize": "<p> The size of the the part in an upload. </p>"
      }
    },
    "ReadmeMarkdown": {
      "base": null,
      "refs": {
        "CreateWorkflowRequest$readmeMarkdown": "<p>The markdown content for the workflow's README file. This provides documentation and usage information for users of the workflow.</p>",
        "CreateWorkflowVersionRequest$readmeMarkdown": "<p>The markdown content for the workflow version's README file. This provides documentation and usage information for users of this specific workflow version.</p>",
        "UpdateWorkflowRequest$readmeMarkdown": "<p>The markdown content for the workflow's README file. This provides documentation and usage information for users of the workflow.</p>",
        "UpdateWorkflowVersionRequest$readmeMarkdown": "<p>The markdown content for the workflow version's README file. This provides documentation and usage information for users of this specific workflow version.</p>"
      }
    },
    "ReadmePath": {
      "base": null,
      "refs": {
        "CreateWorkflowRequest$readmePath": "<p>The path to the workflow README markdown file within the repository. This file provides documentation and usage information for the workflow. If not specified, the <code>README.md</code> file from the root directory of the repository will be used.</p>",
        "CreateWorkflowVersionRequest$readmePath": "<p>The path to the workflow version README markdown file within the repository. This file provides documentation and usage information for the workflow. If not specified, the <code>README.md</code> file from the root directory of the repository will be used.</p>",
        "GetWorkflowResponse$readmePath": "<p>The path to the workflow README markdown file within the repository. This file provides documentation and usage information for the workflow. If not specified, the <code>README.md</code> file from the root directory of the repository will be used.</p>",
        "GetWorkflowVersionResponse$readmePath": "<p>The path to the workflow version README markdown file within the repository. This file provides documentation and usage information for the workflow. If not specified, the <code>README.md</code> file from the root directory of the repository will be used.</p>"
      }
    },
    "ReadmeS3PresignedUrl": {
      "base": null,
      "refs": {
        "GetWorkflowResponse$readme": "<p>The README content for the workflow, providing documentation and usage information.</p>",
        "GetWorkflowVersionResponse$readme": "<p>The README content for the workflow version, providing documentation and usage information specific to this version.</p>"
      }
    },
    "ReferenceArn": {
      "base": null,
      "refs": {
        "CreateMultipartReadSetUploadRequest$referenceArn": "<p>The ARN of the reference.</p>",
        "CreateMultipartReadSetUploadResponse$referenceArn": "<p>The read set source's reference ARN.</p>",
        "GetReadSetMetadataResponse$referenceArn": "<p>The read set's genome reference ARN.</p>",
        "GetReferenceMetadataResponse$arn": "<p>The reference's ARN.</p>",
        "ImportReadSetSourceItem$referenceArn": "<p>The source's genome reference ARN.</p>",
        "MultipartReadSetUploadListItem$referenceArn": "<p> The source's reference ARN. </p>",
        "ReadSetListItem$referenceArn": "<p>The read set's genome reference ARN.</p>",
        "ReferenceItem$referenceArn": "<p>The reference's ARN.</p>",
        "ReferenceListItem$arn": "<p>The reference's ARN.</p>",
        "StartReadSetImportJobSourceItem$referenceArn": "<p>The source's reference ARN.</p>"
      }
    },
    "ReferenceArnFilter": {
      "base": null,
      "refs": {
        "ReadSetFilter$referenceArn": "<p>A genome reference ARN to filter on.</p>"
      }
    },
    "ReferenceCreationType": {
      "base": null,
      "refs": {
        "GetReferenceMetadataResponse$creationType": "<p>The reference's creation type.</p>"
      }
    },
    "ReferenceDescription": {
      "base": null,
      "refs": {
        "GetReferenceMetadataResponse$description": "<p>The reference's description.</p>",
        "ImportReferenceSourceItem$description": "<p>The source's description.</p>",
        "ReferenceListItem$description": "<p>The reference's description.</p>",
        "StartReferenceImportJobSourceItem$description": "<p>The source's description.</p>"
      }
    },
    "ReferenceFile": {
      "base": null,
      "refs": {
        "GetReferenceRequest$file": "<p>The file to retrieve.</p>"
      }
    },
    "ReferenceFiles": {
      "base": "<p>A set of genome reference files.</p>",
      "refs": {
        "GetReferenceMetadataResponse$files": "<p>The reference's files.</p>"
      }
    },
    "ReferenceFilter": {
      "base": "<p>A filter for references.</p>",
      "refs": {
        "ListReferencesRequest$filter": "<p>A filter to apply to the list.</p>"
      }
    },
    "ReferenceId": {
      "base": null,
      "refs": {
        "DeleteReferenceRequest$id": "<p>The reference's ID.</p>",
        "GetReferenceMetadataRequest$id": "<p>The reference's ID.</p>",
        "GetReferenceMetadataResponse$id": "<p>The reference's ID.</p>",
        "GetReferenceRequest$id": "<p>The reference's ID.</p>",
        "ImportReferenceSourceItem$referenceId": "<p>The source's reference ID.</p>",
        "ReferenceListItem$id": "<p>The reference's ID.</p>"
      }
    },
    "ReferenceImportJobItemStatus": {
      "base": null,
      "refs": {
        "ImportReferenceSourceItem$status": "<p>The source's status.</p>"
      }
    },
    "ReferenceImportJobStatus": {
      "base": null,
      "refs": {
        "GetReferenceImportJobResponse$status": "<p>The job's status.</p>",
        "ImportReferenceFilter$status": "<p>A status to filter on.</p>",
        "ImportReferenceJobItem$status": "<p>The job's status.</p>",
        "StartReferenceImportJobResponse$status": "<p>The job's status.</p>"
      }
    },
    "ReferenceItem": {
      "base": "<p>A genome reference.</p>",
      "refs": {
        "AnnotationStoreItem$reference": "<p>The store's genome reference.</p>",
        "CreateAnnotationStoreRequest$reference": "<p>The genome reference for the store's annotations.</p>",
        "CreateAnnotationStoreResponse$reference": "<p>The store's genome reference. Required for all stores except TSV format with generic annotations.</p>",
        "CreateVariantStoreRequest$reference": "<p>The genome reference for the store's variants.</p>",
        "CreateVariantStoreResponse$reference": "<p>The store's genome reference.</p>",
        "GetAnnotationStoreResponse$reference": "<p>The store's genome reference.</p>",
        "GetVariantStoreResponse$reference": "<p>The store's genome reference.</p>",
        "UpdateAnnotationStoreResponse$reference": "<p>The store's genome reference.</p>",
        "UpdateVariantStoreResponse$reference": "<p>The store's genome reference.</p>",
        "VariantStoreItem$reference": "<p>The store's genome reference.</p>"
      }
    },
    "ReferenceList": {
      "base": null,
      "refs": {
        "ListReferencesResponse$references": "<p>A list of references.</p>"
      }
    },
    "ReferenceListItem": {
      "base": "<p>A genome reference.</p>",
      "refs": {
        "ReferenceList$member": null
      }
    },
    "ReferenceName": {
      "base": null,
      "refs": {
        "GetReferenceMetadataResponse$name": "<p>The reference's name.</p>",
        "ImportReferenceSourceItem$name": "<p>The source's name.</p>",
        "ReferenceFilter$name": "<p>A name to filter on.</p>",
        "ReferenceListItem$name": "<p>The reference's name.</p>",
        "StartReferenceImportJobSourceItem$name": "<p>The source's name.</p>"
      }
    },
    "ReferenceStatus": {
      "base": null,
      "refs": {
        "GetReferenceMetadataResponse$status": "<p>The reference's status.</p>",
        "ReferenceListItem$status": "<p>The reference's status.</p>"
      }
    },
    "ReferenceStoreArn": {
      "base": null,
      "refs": {
        "CreateReferenceStoreResponse$arn": "<p>The store's ARN.</p>",
        "GetReferenceStoreResponse$arn": "<p>The store's ARN.</p>",
        "ReferenceStoreDetail$arn": "<p>The store's ARN.</p>"
      }
    },
    "ReferenceStoreDescription": {
      "base": null,
      "refs": {
        "CreateReferenceStoreRequest$description": "<p>A description for the store.</p>",
        "CreateReferenceStoreResponse$description": "<p>The store's description.</p>",
        "GetReferenceStoreResponse$description": "<p>The store's description.</p>",
        "ReferenceStoreDetail$description": "<p>The store's description.</p>"
      }
    },
    "ReferenceStoreDetail": {
      "base": "<p>Details about a reference store.</p>",
      "refs": {
        "ReferenceStoreDetailList$member": null
      }
    },
    "ReferenceStoreDetailList": {
      "base": null,
      "refs": {
        "ListReferenceStoresResponse$referenceStores": "<p>A list of reference stores.</p>"
      }
    },
    "ReferenceStoreFilter": {
      "base": "<p>A filter for reference stores.</p>",
      "refs": {
        "ListReferenceStoresRequest$filter": "<p>A filter to apply to the list.</p>"
      }
    },
    "ReferenceStoreId": {
      "base": null,
      "refs": {
        "CreateReferenceStoreResponse$id": "<p>The store's ID.</p>",
        "DeleteReferenceRequest$referenceStoreId": "<p>The reference's store ID.</p>",
        "DeleteReferenceStoreRequest$id": "<p>The store's ID.</p>",
        "GetReferenceImportJobRequest$referenceStoreId": "<p>The job's reference store ID.</p>",
        "GetReferenceImportJobResponse$referenceStoreId": "<p>The job's reference store ID.</p>",
        "GetReferenceMetadataRequest$referenceStoreId": "<p>The reference's reference store ID.</p>",
        "GetReferenceMetadataResponse$referenceStoreId": "<p>The reference's reference store ID.</p>",
        "GetReferenceRequest$referenceStoreId": "<p>The reference's store ID.</p>",
        "GetReferenceStoreRequest$id": "<p>The store's ID.</p>",
        "GetReferenceStoreResponse$id": "<p>The store's ID.</p>",
        "ImportReferenceJobItem$referenceStoreId": "<p>The job's reference store ID.</p>",
        "ListReferenceImportJobsRequest$referenceStoreId": "<p>The job's reference store ID.</p>",
        "ListReferencesRequest$referenceStoreId": "<p>The references' reference store ID.</p>",
        "ReferenceListItem$referenceStoreId": "<p>The reference's store ID.</p>",
        "ReferenceStoreDetail$id": "<p>The store's ID.</p>",
        "StartReferenceImportJobRequest$referenceStoreId": "<p>The job's reference store ID.</p>",
        "StartReferenceImportJobResponse$referenceStoreId": "<p>The job's reference store ID.</p>"
      }
    },
    "ReferenceStoreName": {
      "base": null,
      "refs": {
        "CreateReferenceStoreRequest$name": "<p>A name for the store.</p>",
        "CreateReferenceStoreResponse$name": "<p>The store's name.</p>",
        "GetReferenceStoreResponse$name": "<p>The store's name.</p>",
        "ReferenceStoreDetail$name": "<p>The store's name.</p>",
        "ReferenceStoreFilter$name": "<p>The name to filter on.</p>"
      }
    },
    "ReferenceStreamingBlob": {
      "base": null,
      "refs": {
        "GetReferenceResponse$payload": "<p>The reference file payload.</p>"
      }
    },
    "RegistryMapping": {
      "base": "<p>If you are using the ECR pull through cache feature, the registry mapping maps between the ECR repository and the upstream registry where container images are pulled and synchronized.</p>",
      "refs": {
        "RegistryMappingsList$member": null
      }
    },
    "RegistryMappingsList": {
      "base": null,
      "refs": {
        "ContainerRegistryMap$registryMappings": "<p>Mapping that provides the ECR repository path where upstream container images are pulled and synchronized.</p>"
      }
    },
    "RequestTimeoutException": {
      "base": "<p>The request timed out.</p>",
      "refs": {
      }
    },
    "ResourceId": {
      "base": null,
      "refs": {
        "AnnotationStoreItem$id": "<p>The store's ID.</p>",
        "AnnotationStoreVersionItem$storeId": "<p> The store ID for an annotation store version. </p>",
        "AnnotationStoreVersionItem$id": "<p> The annotation store version ID. </p>",
        "CancelAnnotationImportRequest$jobId": "<p>The job's ID.</p>",
        "CancelVariantImportRequest$jobId": "<p>The job's ID.</p>",
        "CreateAnnotationStoreResponse$id": "<p>The store's ID.</p>",
        "CreateAnnotationStoreVersionResponse$id": "<p> A generated ID for the annotation store </p>",
        "CreateAnnotationStoreVersionResponse$storeId": "<p> The ID for the annotation store from which new versions are being created. </p>",
        "CreateVariantStoreResponse$id": "<p>The store's ID.</p>",
        "GetAnnotationImportRequest$jobId": "<p>The job's ID.</p>",
        "GetAnnotationImportResponse$id": "<p>The job's ID.</p>",
        "GetAnnotationStoreResponse$id": "<p>The store's ID.</p>",
        "GetAnnotationStoreVersionResponse$storeId": "<p> The store ID for annotation store version. </p>",
        "GetAnnotationStoreVersionResponse$id": "<p> The annotation store version ID. </p>",
        "GetVariantImportRequest$jobId": "<p>The job's ID.</p>",
        "GetVariantImportResponse$id": "<p>The job's ID.</p>",
        "GetVariantStoreResponse$id": "<p>The store's ID.</p>",
        "StartAnnotationImportResponse$jobId": "<p>The job's ID.</p>",
        "StartVariantImportResponse$jobId": "<p>The job's ID.</p>",
        "UpdateAnnotationStoreResponse$id": "<p>The store's ID.</p>",
        "UpdateAnnotationStoreVersionResponse$storeId": "<p> The annotation store ID. </p>",
        "UpdateAnnotationStoreVersionResponse$id": "<p> The annotation store version ID. </p>",
        "UpdateVariantStoreResponse$id": "<p>The store's ID.</p>",
        "VariantStoreItem$id": "<p>The store's ID.</p>"
      }
    },
    "ResourceIdentifier": {
      "base": null,
      "refs": {
        "ListAnnotationImportJobsRequestIdsList$member": null,
        "ListAnnotationStoresRequestIdsList$member": null,
        "ListVariantImportJobsRequestIdsList$member": null,
        "ListVariantStoresRequestIdsList$member": null
      }
    },
    "ResourceNotFoundException": {
      "base": "<p>The target resource was not found in the current Region.</p>",
      "refs": {
      }
    },
    "ResourceOwner": {
      "base": null,
      "refs": {
        "ListSharesRequest$resourceOwner": "<p>The account that owns the resource shares.</p>"
      }
    },
    "RoleArn": {
      "base": null,
      "refs": {
        "GetReadSetImportJobResponse$roleArn": "<p>The job's service role ARN.</p>",
        "GetReferenceImportJobResponse$roleArn": "<p>The job's service role ARN.</p>",
        "ImportReadSetJobItem$roleArn": "<p>The job's service role ARN.</p>",
        "ImportReferenceJobItem$roleArn": "<p>The job's service role ARN.</p>",
        "StartReadSetExportJobRequest$roleArn": "<p>A service role for the job.</p>",
        "StartReadSetImportJobRequest$roleArn": "<p>A service role for the job.</p>",
        "StartReadSetImportJobResponse$roleArn": "<p>The job's service role ARN.</p>",
        "StartReferenceImportJobRequest$roleArn": "<p>A service role for the job.</p>",
        "StartReferenceImportJobResponse$roleArn": "<p>The job's service role ARN.</p>"
      }
    },
    "RunArn": {
      "base": null,
      "refs": {
        "GetRunResponse$arn": "<p>The run's ARN.</p>",
        "RunListItem$arn": "<p>The run's ARN.</p>",
        "StartRunResponse$arn": "<p>Unique resource identifier for the run.</p>"
      }
    },
    "RunCacheArn": {
      "base": null,
      "refs": {
        "CreateRunCacheResponse$arn": "<p>Unique resource identifier for the run cache.</p>",
        "GetRunCacheResponse$arn": "<p>Unique resource identifier for the run cache.</p>",
        "RunCacheListItem$arn": "<p>Unique resource identifier for the run cache.</p>"
      }
    },
    "RunCacheId": {
      "base": null,
      "refs": {
        "CreateRunCacheResponse$id": "<p>Identifier for the run cache.</p>",
        "DeleteRunCacheRequest$id": "<p>Run cache identifier for the cache you want to delete.</p>",
        "GetRunCacheRequest$id": "<p>The identifier of the run cache to retrieve.</p>",
        "GetRunCacheResponse$id": "<p>The run cache ID.</p>",
        "RunCacheListItem$id": "<p>The identifier for this run cache.</p>",
        "UpdateRunCacheRequest$id": "<p>The identifier of the run cache you want to update.</p>"
      }
    },
    "RunCacheList": {
      "base": null,
      "refs": {
        "ListRunCachesResponse$items": "<p>Details about each run cache in the response.</p>"
      }
    },
    "RunCacheListItem": {
      "base": "<p>List entry for one run cache.</p>",
      "refs": {
        "RunCacheList$member": null
      }
    },
    "RunCacheRequestId": {
      "base": null,
      "refs": {
        "CreateRunCacheRequest$requestId": "<p>A unique request token, to ensure idempotency. If you don't specify a token, Amazon Web Services HealthOmics automatically generates a universally unique identifier (UUID) for the request.</p>"
      }
    },
    "RunCacheStatus": {
      "base": null,
      "refs": {
        "CreateRunCacheResponse$status": "<p>Run cache status.</p>",
        "GetRunCacheResponse$status": "<p>The run cache status.</p>",
        "RunCacheListItem$status": "<p>The run cache status.</p>"
      }
    },
    "RunCacheTimestamp": {
      "base": null,
      "refs": {
        "GetRunCacheResponse$creationTime": "<p>Creation time of the run cache (an ISO 8601 formatted string).</p>",
        "RunCacheListItem$creationTime": "<p>The time that this run cache was created (an ISO 8601 formatted string).</p>"
      }
    },
    "RunExport": {
      "base": null,
      "refs": {
        "RunExportList$member": null
      }
    },
    "RunExportList": {
      "base": null,
      "refs": {
        "GetRunRequest$export": "<p>The run's export format.</p>"
      }
    },
    "RunFailureReason": {
      "base": null,
      "refs": {
        "GetRunResponse$failureReason": "<p>The reason a run has failed.</p>"
      }
    },
    "RunGroupArn": {
      "base": null,
      "refs": {
        "CreateRunGroupResponse$arn": "<p>The group's ARN.</p>",
        "GetRunGroupResponse$arn": "<p>The group's ARN.</p>",
        "RunGroupListItem$arn": "<p>The group's ARN.</p>"
      }
    },
    "RunGroupId": {
      "base": null,
      "refs": {
        "CreateRunGroupResponse$id": "<p>The group's ID.</p>",
        "DeleteRunGroupRequest$id": "<p>The run group's ID.</p>",
        "GetRunGroupRequest$id": "<p>The group's ID.</p>",
        "GetRunGroupResponse$id": "<p>The group's ID.</p>",
        "GetRunResponse$runGroupId": "<p>The run's group ID.</p>",
        "ListRunsRequest$runGroupId": "<p>Filter the list by run group ID.</p>",
        "RunGroupListItem$id": "<p>The group's ID.</p>",
        "StartRunRequest$runGroupId": "<p>The run's group ID. Use a run group to cap the compute resources (and number of concurrent runs) for the runs that you add to the run group.</p>",
        "UpdateRunGroupRequest$id": "<p>The group's ID.</p>"
      }
    },
    "RunGroupList": {
      "base": null,
      "refs": {
        "ListRunGroupsResponse$items": "<p>A list of groups.</p>"
      }
    },
    "RunGroupListItem": {
      "base": "<p>A run group.</p>",
      "refs": {
        "RunGroupList$member": null
      }
    },
    "RunGroupListItemMaxCpusInteger": {
      "base": null,
      "refs": {
        "RunGroupListItem$maxCpus": "<p>The group's maximum CPU count setting.</p>"
      }
    },
    "RunGroupListItemMaxDurationInteger": {
      "base": null,
      "refs": {
        "RunGroupListItem$maxDuration": "<p>The group's maximum duration setting in minutes.</p>"
      }
    },
    "RunGroupListItemMaxGpusInteger": {
      "base": null,
      "refs": {
        "RunGroupListItem$maxGpus": "<p>The group's maximum GPU count setting.</p>"
      }
    },
    "RunGroupListItemMaxRunsInteger": {
      "base": null,
      "refs": {
        "RunGroupListItem$maxRuns": "<p>The group's maximum concurrent run setting.</p>"
      }
    },
    "RunGroupListToken": {
      "base": null,
      "refs": {
        "ListRunGroupsRequest$startingToken": "<p>Specify the pagination token from a previous request to retrieve the next page of results.</p>",
        "ListRunGroupsResponse$nextToken": "<p>A pagination token that's included if more results are available.</p>"
      }
    },
    "RunGroupName": {
      "base": null,
      "refs": {
        "CreateRunGroupRequest$name": "<p>A name for the group.</p>",
        "GetRunGroupResponse$name": "<p>The group's name.</p>",
        "ListRunGroupsRequest$name": "<p>Filter the list by run group name.</p>",
        "RunGroupListItem$name": "<p>The group's name.</p>",
        "UpdateRunGroupRequest$name": "<p>A name for the group.</p>"
      }
    },
    "RunGroupRequestId": {
      "base": null,
      "refs": {
        "CreateRunGroupRequest$requestId": "<p>To ensure that requests don't run multiple times, specify a unique ID for each request.</p>"
      }
    },
    "RunGroupTimestamp": {
      "base": null,
      "refs": {
        "GetRunGroupResponse$creationTime": "<p>When the group was created.</p>",
        "RunGroupListItem$creationTime": "<p>When the group was created.</p>"
      }
    },
    "RunId": {
      "base": null,
      "refs": {
        "CancelRunRequest$id": "<p>The run's ID.</p>",
        "DeleteRunRequest$id": "<p>The run's ID.</p>",
        "GetRunRequest$id": "<p>The run's ID.</p>",
        "GetRunResponse$id": "<p>The run's ID.</p>",
        "GetRunResponse$runId": "<p>The run's ID.</p>",
        "GetRunTaskRequest$id": "<p>The workflow run ID.</p>",
        "ListRunTasksRequest$id": "<p>The run's ID.</p>",
        "RunListItem$id": "<p>The run's ID.</p>",
        "StartRunRequest$runId": "<p>The ID of a run to duplicate.</p>",
        "StartRunResponse$id": "<p>The run's ID.</p>"
      }
    },
    "RunLeftNormalization": {
      "base": null,
      "refs": {
        "AnnotationImportJobItem$runLeftNormalization": "<p>The job's left normalization setting.</p>",
        "GetAnnotationImportResponse$runLeftNormalization": "<p>The job's left normalization setting.</p>",
        "GetVariantImportResponse$runLeftNormalization": "<p>The job's left normalization setting.</p>",
        "StartAnnotationImportRequest$runLeftNormalization": "<p>The job's left normalization setting.</p>",
        "StartVariantImportRequest$runLeftNormalization": "<p>The job's left normalization setting.</p>",
        "VariantImportJobItem$runLeftNormalization": "<p>The job's left normalization setting.</p>"
      }
    },
    "RunList": {
      "base": null,
      "refs": {
        "ListRunsResponse$items": "<p>A list of runs.</p>"
      }
    },
    "RunListItem": {
      "base": "<p>A workflow run.</p>",
      "refs": {
        "RunList$member": null
      }
    },
    "RunListItemPriorityInteger": {
      "base": null,
      "refs": {
        "RunListItem$priority": "<p>The run's priority.</p>"
      }
    },
    "RunListItemStorageCapacityInteger": {
      "base": null,
      "refs": {
        "RunListItem$storageCapacity": "<p>The run's storage capacity in gibibytes. For dynamic storage, after the run has completed, this value is the maximum amount of storage used during the run.</p>"
      }
    },
    "RunListToken": {
      "base": null,
      "refs": {
        "ListRunsRequest$startingToken": "<p>Specify the pagination token from a previous request to retrieve the next page of results.</p>",
        "ListRunsResponse$nextToken": "<p>A pagination token that's included if more results are available.</p>"
      }
    },
    "RunLogLevel": {
      "base": null,
      "refs": {
        "GetRunResponse$logLevel": "<p>The run's log level.</p>",
        "StartRunRequest$logLevel": "<p>A log level for the run.</p>"
      }
    },
    "RunLogLocation": {
      "base": "<p>The URI for the run log.</p>",
      "refs": {
        "GetRunResponse$logLocation": "<p>The location of the run log.</p>"
      }
    },
    "RunLogStream": {
      "base": null,
      "refs": {
        "RunLogLocation$runLogStream": "<p>The log stream ARN for the run log.</p>"
      }
    },
    "RunName": {
      "base": null,
      "refs": {
        "GetRunResponse$name": "<p>The run's name.</p>",
        "ListRunsRequest$name": "<p>Filter the list by run name.</p>",
        "RunListItem$name": "<p>The run's name.</p>",
        "StartRunRequest$name": "<p>A name for the run. Providing a name makes it easier to view and organize runs in the Amazon Web Services HealthOmics console and CloudWatch logs.</p>"
      }
    },
    "RunOutputUri": {
      "base": null,
      "refs": {
        "GetRunResponse$outputUri": "<p>The run's output URI.</p>",
        "GetRunResponse$runOutputUri": "<p>The destination for workflow outputs.</p>",
        "StartRunRequest$outputUri": "<p>An output S3 URI for the run. The S3 bucket must be in the same region as the workflow. The role ARN must have permission to write to this S3 bucket.</p>",
        "StartRunResponse$runOutputUri": "<p>The destination for workflow outputs.</p>"
      }
    },
    "RunParameters": {
      "base": null,
      "refs": {
        "GetRunResponse$parameters": "<p>The run's parameters.</p>",
        "StartRunRequest$parameters": "<p>Parameters for the run. The run needs all required parameters and can include optional parameters. The run cannot include any parameters that are not defined in the parameter template. To retrieve parameters from the run, use the <code>GetRun</code> API operation.</p>"
      }
    },
    "RunRequestId": {
      "base": null,
      "refs": {
        "StartRunRequest$requestId": "<p>An idempotency token used to deduplicate retry requests so that duplicate runs are not created.</p>"
      }
    },
    "RunResourceDigest": {
      "base": null,
      "refs": {
        "RunResourceDigests$value": null
      }
    },
    "RunResourceDigestKey": {
      "base": null,
      "refs": {
        "RunResourceDigests$key": null
      }
    },
    "RunResourceDigests": {
      "base": null,
      "refs": {
        "GetRunResponse$resourceDigests": "<p>The run's resource digests.</p>"
      }
    },
    "RunRetentionMode": {
      "base": null,
      "refs": {
        "GetRunResponse$retentionMode": "<p>The run's retention mode.</p>",
        "StartRunRequest$retentionMode": "<p>The retention mode for the run. The default value is <code>RETAIN</code>. </p> <p>Amazon Web Services HealthOmics stores a fixed number of runs that are available to the console and API. In the default mode (<code>RETAIN</code>), you need to remove runs manually when the number of runs exceeds the maximum. If you set the retention mode to <code>REMOVE</code>, Amazon Web Services HealthOmics automatically removes runs (that have mode set to <code>REMOVE</code>) when the number of runs exceeds the maximum. All run logs remain available in CloudWatch Logs if you need information about a run that is no longer available to the API.</p> <p>For more information about retention mode, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/starting-a-run.html\">Specifying run retention mode</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>"
      }
    },
    "RunRoleArn": {
      "base": null,
      "refs": {
        "GetRunResponse$roleArn": "<p>The run's service role ARN.</p>",
        "StartRunRequest$roleArn": "<p>A service role for the run. The <code>roleArn</code> requires access to Amazon Web Services HealthOmics, S3, CloudWatch Logs, and EC2. An example <code>roleArn</code> is <code>arn:aws:iam::123456789012:role/omics-service-role-serviceRole-W8O1XMPL7QZ</code>. In this example, the AWS account ID is <code>123456789012</code> and the role name is <code>omics-service-role-serviceRole-W8O1XMPL7QZ</code>.</p>"
      }
    },
    "RunStartedBy": {
      "base": null,
      "refs": {
        "GetRunResponse$startedBy": "<p>Who started the run.</p>"
      }
    },
    "RunStatus": {
      "base": null,
      "refs": {
        "GetRunResponse$status": "<p>The run's status.</p>",
        "ListRunsRequest$status": "<p>Filter the list by run status.</p>",
        "RunListItem$status": "<p>The run's status.</p>",
        "StartRunResponse$status": "<p>The run's status.</p>"
      }
    },
    "RunStatusMessage": {
      "base": null,
      "refs": {
        "GetRunResponse$statusMessage": "<p>The run's status message.</p>"
      }
    },
    "RunTimestamp": {
      "base": null,
      "refs": {
        "GetRunResponse$creationTime": "<p>When the run was created.</p>",
        "GetRunResponse$startTime": "<p>When the run started.</p>",
        "GetRunResponse$stopTime": "<p>The run's stop time.</p>",
        "RunListItem$creationTime": "<p>When the run was created.</p>",
        "RunListItem$startTime": "<p>When the run started.</p>",
        "RunListItem$stopTime": "<p>When the run stopped.</p>"
      }
    },
    "RunUuid": {
      "base": null,
      "refs": {
        "GetRunResponse$uuid": "<p>The universally unique identifier for a run.</p>",
        "StartRunResponse$uuid": "<p>The universally unique identifier for a run.</p>"
      }
    },
    "S3AccessConfig": {
      "base": "<p>S3 access configuration parameters.</p>",
      "refs": {
        "CreateSequenceStoreRequest$s3AccessConfig": "<p>S3 access configuration parameters. This specifies the parameters needed to access logs stored in S3 buckets. The S3 bucket must be in the same region and account as the sequence store. </p>",
        "UpdateSequenceStoreRequest$s3AccessConfig": "<p>S3 access configuration parameters.</p>"
      }
    },
    "S3AccessPointArn": {
      "base": null,
      "refs": {
        "DeleteS3AccessPolicyRequest$s3AccessPointArn": "<p>The S3 access point ARN that has the access policy.</p>",
        "GetS3AccessPolicyRequest$s3AccessPointArn": "<p>The S3 access point ARN that has the access policy.</p>",
        "GetS3AccessPolicyResponse$s3AccessPointArn": "<p>The S3 access point ARN that has the access policy.</p>",
        "PutS3AccessPolicyRequest$s3AccessPointArn": "<p>The S3 access point ARN where you want to put the access policy.</p>",
        "PutS3AccessPolicyResponse$s3AccessPointArn": "<p>The S3 access point ARN that now has the access policy.</p>",
        "SequenceStoreS3Access$s3AccessPointArn": "<p>The ARN of the access point associated with the S3 bucket that stores read sets.</p>"
      }
    },
    "S3AccessPolicy": {
      "base": null,
      "refs": {
        "GetS3AccessPolicyResponse$s3AccessPolicy": "<p>The current resource policy that controls S3 access on the store.</p>",
        "PutS3AccessPolicyRequest$s3AccessPolicy": "<p>The resource policy that controls S3 access to the store.</p>"
      }
    },
    "S3Destination": {
      "base": null,
      "refs": {
        "ExportReadSetJobDetail$destination": "<p>The job's destination in Amazon S3.</p>",
        "GetReadSetExportJobResponse$destination": "<p>The job's destination in Amazon S3.</p>",
        "StartReadSetExportJobRequest$destination": "<p>A location for exported files in Amazon S3.</p>",
        "StartReadSetExportJobResponse$destination": "<p>The job's output location.</p>"
      }
    },
    "S3Uri": {
      "base": null,
      "refs": {
        "AnnotationImportItemDetail$source": "<p>The source file's location in Amazon S3.</p>",
        "AnnotationImportItemSource$source": "<p>The source file's location in Amazon S3.</p>",
        "ImportReferenceSourceItem$sourceFile": "<p>The source file's location in Amazon S3.</p>",
        "ReadSetS3Access$s3Uri": "<p>The S3 URI for each read set file.</p>",
        "SequenceStoreS3Access$s3Uri": "<p>The S3 URI of the sequence store.</p>",
        "SourceFiles$source1": "<p>The location of the first file in Amazon S3.</p>",
        "SourceFiles$source2": "<p>The location of the second file in Amazon S3.</p>",
        "StartReferenceImportJobSourceItem$sourceFile": "<p>The source file's location in Amazon S3.</p>",
        "VariantImportItemDetail$source": "<p>The source file's location in Amazon S3.</p>",
        "VariantImportItemSource$source": "<p>The source file's location in Amazon S3.</p>"
      }
    },
    "S3UriForBucketOrObject": {
      "base": "<p>The URI of an S3 object or bucket.</p>",
      "refs": {
        "CreateRunCacheRequest$cacheS3Location": "<p>Specify the S3 location for storing the cached task outputs. This data must be immediately accessible (not in an archived state).</p>",
        "GetRunCacheResponse$cacheS3Uri": "<p>The S3 URI where the cache data is stored.</p>",
        "GetRunTaskResponse$cacheS3Uri": "<p>The S3 URI of the cache location.</p>",
        "RunCacheListItem$cacheS3Uri": "<p>The S3 URI for the run cache data.</p>",
        "TaskListItem$cacheS3Uri": "<p>The S3 URI of the cache location.</p>"
      }
    },
    "S3UriForObject": {
      "base": null,
      "refs": {
        "CreateWorkflowRequest$readmeUri": "<p>The S3 URI of the README file for the workflow. This file provides documentation and usage information for the workflow. Requirements include:</p> <ul> <li> <p>The S3 URI must begin with <code>s3://USER-OWNED-BUCKET/</code> </p> </li> <li> <p>The requester must have access to the S3 bucket and object.</p> </li> <li> <p>The max README content length is 500 KiB.</p> </li> </ul>",
        "CreateWorkflowVersionRequest$readmeUri": "<p>The S3 URI of the README file for the workflow version. This file provides documentation and usage information for the workflow version. Requirements include:</p> <ul> <li> <p>The S3 URI must begin with <code>s3://USER-OWNED-BUCKET/</code> </p> </li> <li> <p>The requester must have access to the S3 bucket and object.</p> </li> <li> <p>The max README content length is 500 KiB.</p> </li> </ul>"
      }
    },
    "SampleId": {
      "base": null,
      "refs": {
        "CreateMultipartReadSetUploadRequest$sampleId": "<p>The source's sample ID.</p>",
        "CreateMultipartReadSetUploadResponse$sampleId": "<p>The source's sample ID.</p>",
        "GetReadSetMetadataResponse$sampleId": "<p>The read set's sample ID.</p>",
        "ImportReadSetSourceItem$sampleId": "<p>The source's sample ID.</p>",
        "MultipartReadSetUploadListItem$sampleId": "<p> The read set source's sample ID. </p>",
        "ReadSetFilter$sampleId": "<p> The read set source's sample ID. </p>",
        "ReadSetListItem$sampleId": "<p>The read set's sample ID.</p>",
        "StartReadSetImportJobSourceItem$sampleId": "<p>The source's sample ID.</p>"
      }
    },
    "SchemaItem": {
      "base": null,
      "refs": {
        "TsvStoreOptionsSchemaList$member": null,
        "TsvVersionOptionsSchemaList$member": null
      }
    },
    "SchemaItemKeyString": {
      "base": null,
      "refs": {
        "SchemaItem$key": null
      }
    },
    "SchemaValueType": {
      "base": null,
      "refs": {
        "SchemaItem$value": null
      }
    },
    "Separator": {
      "base": null,
      "refs": {
        "ReadOptions$sep": "<p>The file's field separator.</p>"
      }
    },
    "SequenceInformation": {
      "base": "<p>Details about a sequence.</p>",
      "refs": {
        "GetReadSetMetadataResponse$sequenceInformation": "<p>The read set's sequence information.</p>",
        "ReadSetListItem$sequenceInformation": null
      }
    },
    "SequenceStoreArn": {
      "base": null,
      "refs": {
        "CreateSequenceStoreResponse$arn": "<p>The store's ARN.</p>",
        "GetSequenceStoreResponse$arn": "<p>The store's ARN.</p>",
        "SequenceStoreDetail$arn": "<p>The store's ARN.</p>",
        "UpdateSequenceStoreResponse$arn": "<p>The ARN of the sequence store.</p>"
      }
    },
    "SequenceStoreDescription": {
      "base": null,
      "refs": {
        "CreateSequenceStoreRequest$description": "<p>A description for the store.</p>",
        "CreateSequenceStoreResponse$description": "<p>The store's description.</p>",
        "GetSequenceStoreResponse$description": "<p>The store's description.</p>",
        "SequenceStoreDetail$description": "<p>The store's description.</p>",
        "UpdateSequenceStoreRequest$description": "<p>A description for the sequence store.</p>",
        "UpdateSequenceStoreResponse$description": "<p>Description of the sequence store.</p>"
      }
    },
    "SequenceStoreDetail": {
      "base": "<p>Details about a sequence store.</p>",
      "refs": {
        "SequenceStoreDetailList$member": null
      }
    },
    "SequenceStoreDetailList": {
      "base": null,
      "refs": {
        "ListSequenceStoresResponse$sequenceStores": "<p>A list of sequence stores.</p>"
      }
    },
    "SequenceStoreFilter": {
      "base": "<p>A filter for a sequence store.</p>",
      "refs": {
        "ListSequenceStoresRequest$filter": "<p>A filter to apply to the list.</p>"
      }
    },
    "SequenceStoreId": {
      "base": null,
      "refs": {
        "AbortMultipartReadSetUploadRequest$sequenceStoreId": "<p>The sequence store ID for the store involved in the multipart upload.</p>",
        "ActivateReadSetJobItem$sequenceStoreId": "<p>The job's sequence store ID.</p>",
        "BatchDeleteReadSetRequest$sequenceStoreId": "<p>The read sets' sequence store ID.</p>",
        "CompleteMultipartReadSetUploadRequest$sequenceStoreId": "<p>The sequence store ID for the store involved in the multipart upload.</p>",
        "CreateMultipartReadSetUploadRequest$sequenceStoreId": "<p>The sequence store ID for the store that is the destination of the multipart uploads.</p>",
        "CreateMultipartReadSetUploadResponse$sequenceStoreId": "<p>The sequence store ID for the store that the read set will be created in.</p>",
        "CreateSequenceStoreResponse$id": "<p>The store's ID.</p>",
        "DeleteSequenceStoreRequest$id": "<p>The sequence store's ID.</p>",
        "ExportReadSetJobDetail$sequenceStoreId": "<p>The job's sequence store ID.</p>",
        "GetReadSetActivationJobRequest$sequenceStoreId": "<p>The job's sequence store ID.</p>",
        "GetReadSetActivationJobResponse$sequenceStoreId": "<p>The job's sequence store ID.</p>",
        "GetReadSetExportJobRequest$sequenceStoreId": "<p>The job's sequence store ID.</p>",
        "GetReadSetExportJobResponse$sequenceStoreId": "<p>The job's sequence store ID.</p>",
        "GetReadSetImportJobRequest$sequenceStoreId": "<p>The job's sequence store ID.</p>",
        "GetReadSetImportJobResponse$sequenceStoreId": "<p>The job's sequence store ID.</p>",
        "GetReadSetMetadataRequest$sequenceStoreId": "<p>The read set's sequence store ID.</p>",
        "GetReadSetMetadataResponse$sequenceStoreId": "<p>The read set's sequence store ID.</p>",
        "GetReadSetRequest$sequenceStoreId": "<p>The read set's sequence store ID.</p>",
        "GetSequenceStoreRequest$id": "<p>The store's ID.</p>",
        "GetSequenceStoreResponse$id": "<p>The store's ID.</p>",
        "ImportReadSetJobItem$sequenceStoreId": "<p>The job's sequence store ID.</p>",
        "ListMultipartReadSetUploadsRequest$sequenceStoreId": "<p>The sequence store ID used for the multipart uploads.</p>",
        "ListReadSetActivationJobsRequest$sequenceStoreId": "<p>The read set's sequence store ID.</p>",
        "ListReadSetExportJobsRequest$sequenceStoreId": "<p>The jobs' sequence store ID.</p>",
        "ListReadSetImportJobsRequest$sequenceStoreId": "<p>The jobs' sequence store ID.</p>",
        "ListReadSetUploadPartsRequest$sequenceStoreId": "<p>The sequence store ID used for the multipart uploads.</p>",
        "ListReadSetsRequest$sequenceStoreId": "<p>The jobs' sequence store ID.</p>",
        "MultipartReadSetUploadListItem$sequenceStoreId": "<p> The sequence store ID used for the multipart upload. </p>",
        "ReadSetListItem$sequenceStoreId": "<p>The read set's sequence store ID.</p>",
        "SequenceStoreDetail$id": "<p>The store's ID.</p>",
        "StartReadSetActivationJobRequest$sequenceStoreId": "<p>The read set's sequence store ID.</p>",
        "StartReadSetActivationJobResponse$sequenceStoreId": "<p>The read set's sequence store ID.</p>",
        "StartReadSetExportJobRequest$sequenceStoreId": "<p>The read set's sequence store ID.</p>",
        "StartReadSetExportJobResponse$sequenceStoreId": "<p>The read set's sequence store ID.</p>",
        "StartReadSetImportJobRequest$sequenceStoreId": "<p>The read set's sequence store ID.</p>",
        "StartReadSetImportJobResponse$sequenceStoreId": "<p>The read set's sequence store ID.</p>",
        "UpdateSequenceStoreRequest$id": "<p>The ID of the sequence store.</p>",
        "UpdateSequenceStoreResponse$id": "<p>The ID of the sequence store.</p>",
        "UploadReadSetPartRequest$sequenceStoreId": "<p>The sequence store ID used for the multipart upload.</p>"
      }
    },
    "SequenceStoreName": {
      "base": null,
      "refs": {
        "CreateSequenceStoreRequest$name": "<p>A name for the store.</p>",
        "CreateSequenceStoreResponse$name": "<p>The store's name.</p>",
        "GetSequenceStoreResponse$name": "<p>The store's name.</p>",
        "SequenceStoreDetail$name": "<p>The store's name.</p>",
        "SequenceStoreFilter$name": "<p>A name to filter on.</p>",
        "UpdateSequenceStoreRequest$name": "<p>A name for the sequence store.</p>",
        "UpdateSequenceStoreResponse$name": "<p>The name of the sequence store.</p>"
      }
    },
    "SequenceStoreS3Access": {
      "base": "<p>The S3 access metadata of the sequence store.</p>",
      "refs": {
        "CreateSequenceStoreResponse$s3Access": null,
        "GetSequenceStoreResponse$s3Access": "<p>The S3 metadata of a sequence store, including the ARN and S3 URI of the S3 bucket.</p>",
        "UpdateSequenceStoreResponse$s3Access": null
      }
    },
    "SequenceStoreStatus": {
      "base": null,
      "refs": {
        "CreateSequenceStoreResponse$status": "<p>The status of the sequence store.</p>",
        "GetSequenceStoreResponse$status": "<p>The status of the sequence store.</p>",
        "SequenceStoreDetail$status": "<p>Status of the sequence store.</p>",
        "SequenceStoreFilter$status": "<p>Filter results based on status.</p>",
        "UpdateSequenceStoreResponse$status": "<p>The status of the sequence store.</p>"
      }
    },
    "SequenceStoreStatusMessage": {
      "base": null,
      "refs": {
        "CreateSequenceStoreResponse$statusMessage": "<p>The status message of the sequence store.</p>",
        "GetSequenceStoreResponse$statusMessage": "<p>The status message of the sequence store.</p>",
        "SequenceStoreDetail$statusMessage": "<p>The status message of the sequence store.</p>",
        "UpdateSequenceStoreResponse$statusMessage": "<p>The status message of the sequence store.</p>"
      }
    },
    "ServiceQuotaExceededException": {
      "base": "<p>The request exceeds a service quota.</p>",
      "refs": {
      }
    },
    "ShareDetails": {
      "base": "<p>The details of a resource share.</p>",
      "refs": {
        "GetShareResponse$share": "<p>A resource share details object. The object includes the share status, the <code>resourceArn</code>, and the <code>ownerId</code>.</p>",
        "ShareDetailsList$member": null
      }
    },
    "ShareDetailsList": {
      "base": null,
      "refs": {
        "ListSharesResponse$shares": "<p>The shares available and their metadata details.</p>"
      }
    },
    "ShareName": {
      "base": null,
      "refs": {
        "CreateShareRequest$shareName": "<p>A name that the owner defines for the share.</p>",
        "CreateShareResponse$shareName": "<p>The name of the share.</p>",
        "ShareDetails$shareName": "<p>The name of the resource share.</p>"
      }
    },
    "ShareResourceType": {
      "base": null,
      "refs": {
        "TypeList$member": null
      }
    },
    "ShareStatus": {
      "base": null,
      "refs": {
        "AcceptShareResponse$status": "<p>The status of the resource share.</p>",
        "CreateShareResponse$status": "<p>The status of the share.</p>",
        "DeleteShareResponse$status": "<p>The status of the share being deleted.</p>",
        "ShareDetails$status": "<p>The status of the share.</p>",
        "StatusList$member": null
      }
    },
    "SourceFiles": {
      "base": "<p>Source files for a sequence.</p>",
      "refs": {
        "ImportReadSetSourceItem$sourceFiles": "<p>The source files' location in Amazon S3.</p>",
        "StartReadSetImportJobSourceItem$sourceFiles": "<p>The source files' location in Amazon S3.</p>"
      }
    },
    "SourceReference": {
      "base": "<p>Contains information about the source reference in a code repository, such as a branch, tag, or commit.</p>",
      "refs": {
        "DefinitionRepository$sourceReference": "<p>The source reference for the repository, such as a branch name, tag, or commit ID.</p>",
        "DefinitionRepositoryDetails$sourceReference": "<p>The source reference for the repository, such as a branch name, tag, or commit ID.</p>"
      }
    },
    "SourceReferenceType": {
      "base": null,
      "refs": {
        "SourceReference$type": "<p>The type of source reference, such as branch, tag, or commit.</p>"
      }
    },
    "SourceReferenceValue": {
      "base": null,
      "refs": {
        "SourceReference$value": "<p>The value of the source reference, such as the branch name, tag name, or commit ID.</p>"
      }
    },
    "SseConfig": {
      "base": "<p>Server-side encryption (SSE) settings for a store.</p>",
      "refs": {
        "AnnotationStoreItem$sseConfig": "<p>The store's server-side encryption (SSE) settings.</p>",
        "CreateAnnotationStoreRequest$sseConfig": "<p>Server-side encryption (SSE) settings for the store.</p>",
        "CreateReferenceStoreRequest$sseConfig": "<p>Server-side encryption (SSE) settings for the store.</p>",
        "CreateReferenceStoreResponse$sseConfig": "<p>The store's SSE settings.</p>",
        "CreateSequenceStoreRequest$sseConfig": "<p>Server-side encryption (SSE) settings for the store.</p>",
        "CreateSequenceStoreResponse$sseConfig": "<p>Server-side encryption (SSE) settings for the store. This contains the KMS key ARN that is used to encrypt read set objects.</p>",
        "CreateVariantStoreRequest$sseConfig": "<p>Server-side encryption (SSE) settings for the store.</p>",
        "GetAnnotationStoreResponse$sseConfig": "<p>The store's server-side encryption (SSE) settings.</p>",
        "GetReferenceStoreResponse$sseConfig": "<p>The store's server-side encryption (SSE) settings.</p>",
        "GetSequenceStoreResponse$sseConfig": "<p>The store's server-side encryption (SSE) settings.</p>",
        "GetVariantStoreResponse$sseConfig": "<p>The store's server-side encryption (SSE) settings.</p>",
        "ReferenceStoreDetail$sseConfig": "<p>The store's server-side encryption (SSE) settings.</p>",
        "SequenceStoreDetail$sseConfig": "<p>The store's server-side encryption (SSE) settings.</p>",
        "UpdateSequenceStoreResponse$sseConfig": null,
        "VariantStoreItem$sseConfig": "<p>The store's server-side encryption (SSE) settings.</p>"
      }
    },
    "SseConfigKeyArnString": {
      "base": null,
      "refs": {
        "SseConfig$keyArn": "<p>An encryption key ARN.</p>"
      }
    },
    "StartAnnotationImportRequest": {
      "base": null,
      "refs": {
      }
    },
    "StartAnnotationImportResponse": {
      "base": null,
      "refs": {
      }
    },
    "StartReadSetActivationJobRequest": {
      "base": null,
      "refs": {
      }
    },
    "StartReadSetActivationJobRequestSourcesList": {
      "base": null,
      "refs": {
        "StartReadSetActivationJobRequest$sources": "<p>The job's source files.</p>"
      }
    },
    "StartReadSetActivationJobResponse": {
      "base": null,
      "refs": {
      }
    },
    "StartReadSetActivationJobSourceItem": {
      "base": "<p>A source for a read set activation job.</p>",
      "refs": {
        "StartReadSetActivationJobRequestSourcesList$member": null
      }
    },
    "StartReadSetExportJobRequest": {
      "base": null,
      "refs": {
      }
    },
    "StartReadSetExportJobRequestSourcesList": {
      "base": null,
      "refs": {
        "StartReadSetExportJobRequest$sources": "<p>The job's source files.</p>"
      }
    },
    "StartReadSetExportJobResponse": {
      "base": null,
      "refs": {
      }
    },
    "StartReadSetImportJobRequest": {
      "base": null,
      "refs": {
      }
    },
    "StartReadSetImportJobRequestSourcesList": {
      "base": null,
      "refs": {
        "StartReadSetImportJobRequest$sources": "<p>The job's source files.</p>"
      }
    },
    "StartReadSetImportJobResponse": {
      "base": null,
      "refs": {
      }
    },
    "StartReadSetImportJobSourceItem": {
      "base": "<p>A source for a read set import job.</p>",
      "refs": {
        "StartReadSetImportJobRequestSourcesList$member": null
      }
    },
    "StartReferenceImportJobRequest": {
      "base": null,
      "refs": {
      }
    },
    "StartReferenceImportJobRequestSourcesList": {
      "base": null,
      "refs": {
        "StartReferenceImportJobRequest$sources": "<p>The job's source files.</p>"
      }
    },
    "StartReferenceImportJobResponse": {
      "base": null,
      "refs": {
      }
    },
    "StartReferenceImportJobSourceItem": {
      "base": "<p>A source for a reference import job.</p>",
      "refs": {
        "StartReferenceImportJobRequestSourcesList$member": null
      }
    },
    "StartRunRequest": {
      "base": null,
      "refs": {
      }
    },
    "StartRunRequestPriorityInteger": {
      "base": null,
      "refs": {
        "StartRunRequest$priority": "<p>Use the run priority (highest: 1) to establish the order of runs in a run group when you start a run. If multiple runs share the same priority, the run that was initiated first will have the higher priority. Runs that do not belong to a run group can be assigned a priority. The priorities of these runs are ranked among other runs that are not in a run group. For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/creating-run-groups.html#run-priority\">Run priority</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>"
      }
    },
    "StartRunRequestStorageCapacityInteger": {
      "base": null,
      "refs": {
        "StartRunRequest$storageCapacity": "<p>The <code>STATIC</code> storage capacity (in gibibytes, GiB) for this run. The default run storage capacity is 1200 GiB. If your requested storage capacity is unavailable, the system rounds up the value to the nearest 1200 GiB multiple. If the requested storage capacity is still unavailable, the system rounds up the value to the nearest 2400 GiB multiple. This field is not required if the storage type is <code>DYNAMIC</code> (the system ignores any value that you enter).</p>"
      }
    },
    "StartRunResponse": {
      "base": null,
      "refs": {
      }
    },
    "StartVariantImportRequest": {
      "base": null,
      "refs": {
      }
    },
    "StartVariantImportResponse": {
      "base": null,
      "refs": {
      }
    },
    "StatusList": {
      "base": null,
      "refs": {
        "Filter$status": "<p>Filter based on the resource status. You can specify up to 10 values.</p>"
      }
    },
    "StatusMessage": {
      "base": null,
      "refs": {
        "AnnotationStoreItem$statusMessage": "<p>The store's status message.</p>",
        "AnnotationStoreVersionItem$statusMessage": "<p>The status message of an annotation store version.</p>",
        "GetAnnotationStoreResponse$statusMessage": "<p>A status message.</p>",
        "GetAnnotationStoreVersionResponse$statusMessage": "<p>The status message of an annotation store version.</p>",
        "GetVariantStoreResponse$statusMessage": "<p>The store's status message.</p>",
        "ShareDetails$statusMessage": "<p>The status message for a resource share. It provides additional details about the share status.</p>",
        "VariantStoreItem$statusMessage": "<p>The store's status message.</p>"
      }
    },
    "StorageType": {
      "base": null,
      "refs": {
        "CreateWorkflowRequest$storageType": "<p>The default storage type for runs that use this workflow. The <code>storageType</code> can be overridden at run time. <code>DYNAMIC</code> storage dynamically scales the storage up or down, based on file system utilization. <code>STATIC</code> storage allocates a fixed amount of storage. For more information about dynamic and static storage types, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/workflows-run-types.html\">Run storage types</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
        "CreateWorkflowVersionRequest$storageType": "<p>The default storage type for runs that use this workflow version. The <code>storageType</code> can be overridden at run time. <code>DYNAMIC</code> storage dynamically scales the storage up or down, based on file system utilization. <code>STATIC</code> storage allocates a fixed amount of storage. For more information about dynamic and static storage types, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/workflows-run-types.html\">Run storage types</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
        "GetRunResponse$storageType": "<p>The run's storage type.</p>",
        "GetWorkflowResponse$storageType": "<p>The default storage type for runs using this workflow.</p>",
        "GetWorkflowVersionResponse$storageType": "<p>The default storage type for the run.</p>",
        "RunListItem$storageType": "<p>The run's storage type.</p>",
        "StartRunRequest$storageType": "<p>The storage type for the run. If you set the storage type to <code>DYNAMIC</code>, Amazon Web Services HealthOmics dynamically scales the storage up or down, based on file system utilization. By default, the run uses <code>STATIC</code> storage type, which allocates a fixed amount of storage. For more information about <code>DYNAMIC</code> and <code>STATIC</code> storage, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/workflows-run-types.html\">Run storage types</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
        "UpdateWorkflowRequest$storageType": "<p>The default storage type for runs that use this workflow. <code>STATIC</code> storage allocates a fixed amount of storage. <code>DYNAMIC</code> storage dynamically scales the storage up or down, based on file system utilization. For more information about static and dynamic storage, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/Using-workflows.html\">Running workflows</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
        "UpdateWorkflowVersionRequest$storageType": "<p>The default storage type for runs that use this workflow version. The <code>storageType</code> can be overridden at run time. <code>DYNAMIC</code> storage dynamically scales the storage up or down, based on file system utilization. <code>STATIC</code> storage allocates a fixed amount of storage. For more information about dynamic and static storage types, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/workflows-run-types.html\">Run storage types</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>"
      }
    },
    "StoreFormat": {
      "base": null,
      "refs": {
        "AnnotationStoreItem$storeFormat": "<p>The store's file format.</p>",
        "CreateAnnotationStoreRequest$storeFormat": "<p>The annotation file format of the store.</p>",
        "CreateAnnotationStoreResponse$storeFormat": "<p>The annotation file format of the store.</p>",
        "GetAnnotationStoreResponse$storeFormat": "<p>The store's annotation file format.</p>",
        "UpdateAnnotationStoreResponse$storeFormat": "<p>The annotation file format of the store.</p>"
      }
    },
    "StoreId": {
      "base": null,
      "refs": {
        "GetS3AccessPolicyResponse$storeId": "<p>The Amazon Web Services-generated sequence store or reference store ID.</p>",
        "PutS3AccessPolicyResponse$storeId": "<p>The Amazon Web Services-generated sequence store or reference store ID.</p>"
      }
    },
    "StoreName": {
      "base": null,
      "refs": {
        "AnnotationStoreVersionItem$name": "<p> A name given to an annotation store version to distinguish it from others. </p>",
        "CreateAnnotationStoreRequest$name": "<p>A name for the store.</p>",
        "CreateAnnotationStoreVersionRequest$name": "<p> The name of the annotation store from which versions are being created. </p>",
        "CreateAnnotationStoreVersionResponse$name": "<p> The name given to an annotation store version to distinguish it from other versions. </p>",
        "CreateVariantStoreRequest$name": "<p>A name for the store.</p>",
        "GetAnnotationImportResponse$destinationName": "<p>The job's destination annotation store.</p>",
        "GetAnnotationStoreVersionResponse$name": "<p> The name of the annotation store. </p>",
        "GetVariantImportResponse$destinationName": "<p>The job's destination variant store.</p>",
        "StartAnnotationImportRequest$destinationName": "<p>A destination annotation store for the job.</p>",
        "StartVariantImportRequest$destinationName": "<p>The destination variant store for the job.</p>",
        "UpdateAnnotationStoreVersionResponse$name": "<p> The name of an annotation store. </p>"
      }
    },
    "StoreOptions": {
      "base": "<p>Settings for a store.</p>",
      "refs": {
        "CreateAnnotationStoreRequest$storeOptions": "<p>File parsing options for the annotation store.</p>",
        "CreateAnnotationStoreResponse$storeOptions": "<p>The store's file parsing options.</p>",
        "GetAnnotationStoreResponse$storeOptions": "<p>The store's parsing options.</p>",
        "UpdateAnnotationStoreResponse$storeOptions": "<p>Parsing options for the store.</p>"
      }
    },
    "StoreStatus": {
      "base": null,
      "refs": {
        "AnnotationStoreItem$status": "<p>The store's status.</p>",
        "CreateAnnotationStoreResponse$status": "<p>The store's status.</p>",
        "CreateVariantStoreResponse$status": "<p>The store's status.</p>",
        "DeleteAnnotationStoreResponse$status": "<p>The store's status.</p>",
        "DeleteVariantStoreResponse$status": "<p>The store's status.</p>",
        "GetAnnotationStoreResponse$status": "<p>The store's status.</p>",
        "GetVariantStoreResponse$status": "<p>The store's status.</p>",
        "ListAnnotationStoresFilter$status": "<p>A status to filter on.</p>",
        "ListVariantStoresFilter$status": "<p>A status to filter on.</p>",
        "UpdateAnnotationStoreResponse$status": "<p>The store's status.</p>",
        "UpdateVariantStoreResponse$status": "<p>The store's status.</p>",
        "VariantStoreItem$status": "<p>The store's status.</p>"
      }
    },
    "StoreType": {
      "base": null,
      "refs": {
        "GetS3AccessPolicyResponse$storeType": "<p>The type of store associated with the access point.</p>",
        "PutS3AccessPolicyResponse$storeType": "<p>The type of store associated with the access point.</p>"
      }
    },
    "String": {
      "base": null,
      "refs": {
        "AcceptShareRequest$shareId": "<p>The ID of the resource share.</p>",
        "AccessDeniedException$message": null,
        "AnnotationImportJobItem$id": "<p>The job's ID.</p>",
        "AnnotationImportJobItem$destinationName": "<p>The job's destination annotation store.</p>",
        "AnnotationStoreItem$name": "<p>The store's name.</p>",
        "ArnList$member": null,
        "ConflictException$message": null,
        "CreateAnnotationStoreResponse$name": "<p>The store's name.</p>",
        "CreateShareRequest$resourceArn": "<p>The ARN of the resource to be shared.</p>",
        "CreateShareRequest$principalSubscriber": "<p>The principal subscriber is the account being offered shared access to the resource. </p>",
        "CreateShareResponse$shareId": "<p>The ID that HealthOmics generates for the share.</p>",
        "CreateVariantStoreResponse$name": "<p>The store's name.</p>",
        "DefinitionRepositoryDetails$providerType": "<p>The provider type of the source code repository, such as Bitbucket, GitHub, GitHubEnterpriseServer, GitLab, and GitLabSelfManaged.</p>",
        "DefinitionRepositoryDetails$providerEndpoint": "<p>The endpoint URL of the source code repository provider.</p>",
        "DeleteAnnotationStoreRequest$name": "<p>The store's name.</p>",
        "DeleteAnnotationStoreVersionsRequest$name": "<p> The name of the annotation store from which versions are being deleted. </p>",
        "DeleteShareRequest$shareId": "<p>The ID for the resource share to be deleted.</p>",
        "DeleteVariantStoreRequest$name": "<p>The store's name.</p>",
        "ETag$source1": "<p>The ETag hash calculated on Source1 of the read set.</p>",
        "ETag$source2": "<p>The ETag hash calculated on Source2 of the read set.</p>",
        "ExcludeFilePatternList$member": null,
        "GetAnnotationStoreRequest$name": "<p>The store's name.</p>",
        "GetAnnotationStoreResponse$name": "<p>The store's name.</p>",
        "GetAnnotationStoreVersionRequest$name": "<p> The name of the annotation store. </p>",
        "GetAnnotationStoreVersionRequest$versionName": "<p> The name given to an annotation store version to distinguish it from others. </p>",
        "GetShareRequest$shareId": "<p>The ID of the share.</p>",
        "GetVariantStoreRequest$name": "<p>The store's name.</p>",
        "GetVariantStoreResponse$name": "<p>The store's name.</p>",
        "InternalServerException$message": null,
        "ListAnnotationImportJobsFilter$storeName": "<p>A store name to filter on.</p>",
        "ListAnnotationImportJobsResponse$nextToken": "<p>Specifies the pagination token from a previous request to retrieve the next page of results.</p>",
        "ListAnnotationStoreVersionsRequest$name": "<p> The name of an annotation store. </p>",
        "ListAnnotationStoreVersionsResponse$nextToken": "<p> Specifies the pagination token from a previous request to retrieve the next page of results. </p>",
        "ListAnnotationStoresResponse$nextToken": "<p>A pagination token that's included if more results are available.</p>",
        "ListSharesRequest$nextToken": "<p>Next token returned in the response of a previous <code>ListShares</code> call. Used to get the next page of results.</p>",
        "ListSharesResponse$nextToken": "<p> Next token returned in the response of a previous <code>ListShares</code> call. Used to get the next page of results. </p>",
        "ListVariantImportJobsFilter$storeName": "<p>A store name to filter on.</p>",
        "ListVariantImportJobsResponse$nextToken": "<p>A pagination token that's included if more results are available.</p>",
        "ListVariantStoresResponse$nextToken": "<p>A pagination token that's included if more results are available.</p>",
        "NotSupportedOperationException$message": null,
        "RangeNotSatisfiableException$message": null,
        "ReadSetBatchError$code": "<p>The error's code.</p>",
        "ReadSetBatchError$message": "<p>The error's message.</p>",
        "ReadSetUploadPartListItem$checksum": "<p> A unique identifier used to confirm that parts are being added to the correct upload. </p>",
        "RequestTimeoutException$message": null,
        "ResourceNotFoundException$message": null,
        "SequenceInformation$alignment": "<p>The sequence's alignment setting.</p>",
        "ServiceQuotaExceededException$message": null,
        "ShareDetails$shareId": "<p>The ID of the resource share.</p>",
        "ShareDetails$resourceArn": "<p>The Arn of the shared resource. </p>",
        "ShareDetails$resourceId": "<p>The ID of the shared resource. </p>",
        "ShareDetails$principalSubscriber": "<p>The principal subscriber is the account that is being offered shared access to the resource.</p>",
        "ShareDetails$ownerId": "<p>The account ID for the data owner. The owner creates the resource share.</p>",
        "ThrottlingException$message": null,
        "UpdateAnnotationStoreRequest$name": "<p>A name for the store.</p>",
        "UpdateAnnotationStoreResponse$name": "<p>The store's name.</p>",
        "UpdateAnnotationStoreVersionRequest$name": "<p> The name of an annotation store. </p>",
        "UpdateAnnotationStoreVersionRequest$versionName": "<p> The name of an annotation store version. </p>",
        "UpdateVariantStoreRequest$name": "<p>A name for the store.</p>",
        "UpdateVariantStoreResponse$name": "<p>The store's name.</p>",
        "UploadReadSetPartResponse$checksum": "<p>An identifier used to confirm that parts are being added to the intended upload.</p>",
        "ValidationException$message": null,
        "VariantImportJobItem$id": "<p>The job's ID.</p>",
        "VariantImportJobItem$destinationName": "<p>The job's destination variant store.</p>",
        "VariantStoreItem$name": "<p>The store's name.</p>",
        "VersionDeleteError$message": "<p> The message explaining the error in annotation store deletion. </p>"
      }
    },
    "SubjectId": {
      "base": null,
      "refs": {
        "CreateMultipartReadSetUploadRequest$subjectId": "<p>The source's subject ID.</p>",
        "CreateMultipartReadSetUploadResponse$subjectId": "<p>The source's subject ID.</p>",
        "GetReadSetMetadataResponse$subjectId": "<p>The read set's subject ID.</p>",
        "ImportReadSetSourceItem$subjectId": "<p>The source's subject ID.</p>",
        "MultipartReadSetUploadListItem$subjectId": "<p> The read set source's subject ID. </p>",
        "ReadSetFilter$subjectId": "<p> The read set source's subject ID. </p>",
        "ReadSetListItem$subjectId": "<p>The read set's subject ID.</p>",
        "StartReadSetImportJobSourceItem$subjectId": "<p>The source's subject ID.</p>"
      }
    },
    "SyntheticTimestamp_date_time": {
      "base": null,
      "refs": {
        "ActivateReadSetFilter$createdAfter": "<p>The filter's start date.</p>",
        "ActivateReadSetFilter$createdBefore": "<p>The filter's end date.</p>",
        "ActivateReadSetJobItem$creationTime": "<p>When the job was created.</p>",
        "ActivateReadSetJobItem$completionTime": "<p>When the job completed.</p>",
        "CreateMultipartReadSetUploadResponse$creationTime": "<p>The creation time of the multipart upload.</p>",
        "CreateReferenceStoreResponse$creationTime": "<p>When the store was created.</p>",
        "CreateSequenceStoreResponse$creationTime": "<p>When the store was created.</p>",
        "ExportReadSetFilter$createdAfter": "<p>The filter's start date.</p>",
        "ExportReadSetFilter$createdBefore": "<p>The filter's end date.</p>",
        "ExportReadSetJobDetail$creationTime": "<p>When the job was created.</p>",
        "ExportReadSetJobDetail$completionTime": "<p>When the job completed.</p>",
        "GetReadSetActivationJobResponse$creationTime": "<p>When the job was created.</p>",
        "GetReadSetActivationJobResponse$completionTime": "<p>When the job completed.</p>",
        "GetReadSetExportJobResponse$creationTime": "<p>When the job was created.</p>",
        "GetReadSetExportJobResponse$completionTime": "<p>When the job completed.</p>",
        "GetReadSetImportJobResponse$creationTime": "<p>When the job was created.</p>",
        "GetReadSetImportJobResponse$completionTime": "<p>When the job completed.</p>",
        "GetReadSetMetadataResponse$creationTime": "<p>When the read set was created.</p>",
        "GetReferenceImportJobResponse$creationTime": "<p>When the job was created.</p>",
        "GetReferenceImportJobResponse$completionTime": "<p>When the job completed.</p>",
        "GetReferenceMetadataResponse$creationTime": "<p>When the reference was created.</p>",
        "GetReferenceMetadataResponse$updateTime": "<p>When the reference was updated.</p>",
        "GetReferenceStoreResponse$creationTime": "<p>When the store was created.</p>",
        "GetS3AccessPolicyResponse$updateTime": "<p>The time when the policy was last updated.</p>",
        "GetSequenceStoreResponse$creationTime": "<p>When the store was created.</p>",
        "GetSequenceStoreResponse$updateTime": "<p>The last-updated time of the sequence store.</p>",
        "ImportReadSetFilter$createdAfter": "<p>The filter's start date.</p>",
        "ImportReadSetFilter$createdBefore": "<p>The filter's end date.</p>",
        "ImportReadSetJobItem$creationTime": "<p>When the job was created.</p>",
        "ImportReadSetJobItem$completionTime": "<p>When the job completed.</p>",
        "ImportReferenceFilter$createdAfter": "<p>The filter's start date.</p>",
        "ImportReferenceFilter$createdBefore": "<p>The filter's end date.</p>",
        "ImportReferenceJobItem$creationTime": "<p>When the job was created.</p>",
        "ImportReferenceJobItem$completionTime": "<p>When the job completed.</p>",
        "MultipartReadSetUploadListItem$creationTime": "<p> The time stamp for when a direct upload was created. </p>",
        "ReadSetFilter$createdAfter": "<p>The filter's start date.</p>",
        "ReadSetFilter$createdBefore": "<p>The filter's end date.</p>",
        "ReadSetListItem$creationTime": "<p>When the read set was created.</p>",
        "ReadSetUploadPartListFilter$createdAfter": "<p> Filters for read set uploads after a specified time. </p>",
        "ReadSetUploadPartListFilter$createdBefore": "<p> Filters for read set part uploads before a specified time. </p>",
        "ReadSetUploadPartListItem$creationTime": "<p> The time stamp for when a direct upload was created. </p>",
        "ReadSetUploadPartListItem$lastUpdatedTime": "<p> The time stamp for the most recent update to an uploaded part. </p>",
        "ReferenceFilter$createdAfter": "<p>The filter's start date.</p>",
        "ReferenceFilter$createdBefore": "<p>The filter's end date.</p>",
        "ReferenceListItem$creationTime": "<p>When the reference was created.</p>",
        "ReferenceListItem$updateTime": "<p>When the reference was updated.</p>",
        "ReferenceStoreDetail$creationTime": "<p>When the store was created.</p>",
        "ReferenceStoreFilter$createdAfter": "<p>The filter's start date.</p>",
        "ReferenceStoreFilter$createdBefore": "<p>The filter's end date.</p>",
        "SequenceStoreDetail$creationTime": "<p>When the store was created.</p>",
        "SequenceStoreDetail$updateTime": "<p>The last-updated time of the sequence store.</p>",
        "SequenceStoreFilter$createdAfter": "<p>The filter's start date.</p>",
        "SequenceStoreFilter$createdBefore": "<p>The filter's end date.</p>",
        "SequenceStoreFilter$updatedAfter": "<p>Filter results based on stores updated after the specified time.</p>",
        "SequenceStoreFilter$updatedBefore": "<p>Filter results based on stores updated before the specified time.</p>",
        "StartReadSetActivationJobResponse$creationTime": "<p>When the job was created.</p>",
        "StartReadSetExportJobResponse$creationTime": "<p>When the job was created.</p>",
        "StartReadSetImportJobResponse$creationTime": "<p>When the job was created.</p>",
        "StartReferenceImportJobResponse$creationTime": "<p>When the job was created.</p>",
        "UpdateSequenceStoreResponse$creationTime": "<p>The time when the store was created.</p>",
        "UpdateSequenceStoreResponse$updateTime": "<p>The last-updated time of the sequence store.</p>"
      }
    },
    "TagArn": {
      "base": null,
      "refs": {
        "ListTagsForResourceRequest$resourceArn": "<p>The resource's ARN.</p>",
        "TagResourceRequest$resourceArn": "<p>The resource's ARN.</p>",
        "UntagResourceRequest$resourceArn": "<p>The resource's ARN.</p>"
      }
    },
    "TagKey": {
      "base": null,
      "refs": {
        "PropagatedSetLevelTags$member": null,
        "TagKeyList$member": null,
        "TagMap$key": null,
        "TagResourceRequestTagsMap$key": null
      }
    },
    "TagKeyList": {
      "base": null,
      "refs": {
        "UntagResourceRequest$tagKeys": "<p>Keys of tags to remove.</p>"
      }
    },
    "TagMap": {
      "base": null,
      "refs": {
        "CreateAnnotationStoreRequest$tags": "<p>Tags for the store.</p>",
        "CreateAnnotationStoreVersionRequest$tags": "<p> Any tags added to the annotation store version. </p>",
        "CreateMultipartReadSetUploadRequest$tags": "<p>Any tags to add to the read set.</p>",
        "CreateMultipartReadSetUploadResponse$tags": "<p>The tags to add to the read set.</p>",
        "CreateReferenceStoreRequest$tags": "<p>Tags for the store.</p>",
        "CreateRunCacheRequest$tags": "<p>Specify one or more tags to associate with this run cache.</p>",
        "CreateRunCacheResponse$tags": "<p>The tags associated with this run cache.</p>",
        "CreateRunGroupRequest$tags": "<p>Tags for the group.</p>",
        "CreateRunGroupResponse$tags": "<p>Tags for the run group.</p>",
        "CreateSequenceStoreRequest$tags": "<p>Tags for the store. You can configure up to 50 tags.</p>",
        "CreateVariantStoreRequest$tags": "<p>Tags for the store.</p>",
        "CreateWorkflowRequest$tags": "<p>Tags for the workflow. You can define up to 50 tags for the workflow. For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/add-a-tag.html\">Adding a tag</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
        "CreateWorkflowResponse$tags": "<p>The workflow's tags.</p>",
        "CreateWorkflowVersionRequest$tags": "<p>Tags for this workflow version. You can define up to 50 tags for the workflow. For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/add-a-tag.html\">Adding a tag</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
        "CreateWorkflowVersionResponse$tags": "<p>The workflow version's tags.</p>",
        "GetAnnotationStoreResponse$tags": "<p>The store's tags.</p>",
        "GetAnnotationStoreVersionResponse$tags": "<p> Any tags associated with an annotation store version. </p>",
        "GetRunCacheResponse$tags": "<p>The tags associated with the run cache.</p>",
        "GetRunGroupResponse$tags": "<p>The group's tags.</p>",
        "GetRunResponse$tags": "<p>The run's tags.</p>",
        "GetVariantStoreResponse$tags": "<p>The store's tags.</p>",
        "GetWorkflowResponse$tags": "<p>The workflow's tags.</p>",
        "GetWorkflowVersionResponse$tags": "<p>The workflow version's tags.</p>",
        "ImportReadSetSourceItem$tags": "<p>The source's tags.</p>",
        "ImportReferenceSourceItem$tags": "<p>The source's tags.</p>",
        "ListTagsForResourceResponse$tags": "<p>A list of tags.</p>",
        "MultipartReadSetUploadListItem$tags": "<p> Any tags you wish to add to a read set. </p>",
        "StartReadSetImportJobSourceItem$tags": "<p>The source's tags.</p>",
        "StartReferenceImportJobSourceItem$tags": "<p>The source's tags.</p>",
        "StartRunRequest$tags": "<p>Tags for the run. You can add up to 50 tags per run. For more information, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/add-a-tag.html\">Adding a tag</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
        "StartRunResponse$tags": "<p>The run's tags.</p>"
      }
    },
    "TagResourceRequest": {
      "base": null,
      "refs": {
      }
    },
    "TagResourceRequestTagsMap": {
      "base": null,
      "refs": {
        "TagResourceRequest$tags": "<p>Tags for the resource.</p>"
      }
    },
    "TagResourceResponse": {
      "base": null,
      "refs": {
      }
    },
    "TagValue": {
      "base": null,
      "refs": {
        "TagMap$value": null,
        "TagResourceRequestTagsMap$value": null
      }
    },
    "TaskFailureReason": {
      "base": null,
      "refs": {
        "GetRunTaskResponse$failureReason": "<p>The reason a task has failed.</p>"
      }
    },
    "TaskId": {
      "base": null,
      "refs": {
        "GetRunTaskRequest$taskId": "<p>The task's ID.</p>",
        "GetRunTaskResponse$taskId": "<p>The task's ID.</p>",
        "TaskListItem$taskId": "<p>The task's ID.</p>"
      }
    },
    "TaskImageDigest": {
      "base": null,
      "refs": {
        "ImageDetails$imageDigest": "<p>The container image digest. If the image URI was transformed, this will be the digest of the container image referenced by the transformed URI.</p>"
      }
    },
    "TaskInstanceType": {
      "base": null,
      "refs": {
        "GetRunTaskResponse$instanceType": "<p>The instance type for a task.</p>",
        "TaskListItem$instanceType": "<p> The instance type for a task.</p>"
      }
    },
    "TaskList": {
      "base": null,
      "refs": {
        "ListRunTasksResponse$items": "<p>A list of tasks.</p>"
      }
    },
    "TaskListItem": {
      "base": "<p>A workflow run task.</p>",
      "refs": {
        "TaskList$member": null
      }
    },
    "TaskListItemCpusInteger": {
      "base": null,
      "refs": {
        "TaskListItem$cpus": "<p>The task's CPU count.</p>"
      }
    },
    "TaskListItemGpusInteger": {
      "base": null,
      "refs": {
        "TaskListItem$gpus": "<p> The number of Graphics Processing Units (GPUs) specified for the task. </p>"
      }
    },
    "TaskListItemMemoryInteger": {
      "base": null,
      "refs": {
        "TaskListItem$memory": "<p>The task's memory use in gigabytes.</p>"
      }
    },
    "TaskListToken": {
      "base": null,
      "refs": {
        "ListRunTasksRequest$startingToken": "<p>Specify the pagination token from a previous request to retrieve the next page of results.</p>",
        "ListRunTasksResponse$nextToken": "<p>A pagination token that's included if more results are available.</p>"
      }
    },
    "TaskLogStream": {
      "base": null,
      "refs": {
        "GetRunTaskResponse$logStream": "<p>The task's log stream.</p>"
      }
    },
    "TaskName": {
      "base": null,
      "refs": {
        "GetRunTaskResponse$name": "<p>The task's name.</p>",
        "TaskListItem$name": "<p>The task's name.</p>"
      }
    },
    "TaskStatus": {
      "base": null,
      "refs": {
        "GetRunTaskResponse$status": "<p>The task's status.</p>",
        "ListRunTasksRequest$status": "<p>Filter the list by status.</p>",
        "TaskListItem$status": "<p>The task's status.</p>"
      }
    },
    "TaskStatusMessage": {
      "base": null,
      "refs": {
        "GetRunTaskResponse$statusMessage": "<p>The task's status message.</p>"
      }
    },
    "TaskTimestamp": {
      "base": null,
      "refs": {
        "GetRunTaskResponse$creationTime": "<p>When the task was created.</p>",
        "GetRunTaskResponse$startTime": "<p>The task's start time.</p>",
        "GetRunTaskResponse$stopTime": "<p>The task's stop time.</p>",
        "TaskListItem$creationTime": "<p>When the task was created.</p>",
        "TaskListItem$startTime": "<p>When the task started.</p>",
        "TaskListItem$stopTime": "<p>When the task stopped.</p>"
      }
    },
    "ThrottlingException": {
      "base": "<p>The request was denied due to request throttling.</p>",
      "refs": {
      }
    },
    "TsvOptions": {
      "base": "<p>Formatting options for a TSV file.</p>",
      "refs": {
        "FormatOptions$tsvOptions": "<p>Options for a TSV file.</p>"
      }
    },
    "TsvStoreOptions": {
      "base": "<p>File settings for a TSV store.</p>",
      "refs": {
        "StoreOptions$tsvStoreOptions": "<p>File settings for a TSV store.</p>"
      }
    },
    "TsvStoreOptionsSchemaList": {
      "base": null,
      "refs": {
        "TsvStoreOptions$schema": "<p>The store's schema.</p>"
      }
    },
    "TsvVersionOptions": {
      "base": "<p> The options for a TSV file. </p>",
      "refs": {
        "VersionOptions$tsvVersionOptions": "<p> File settings for a version of a TSV store. </p>"
      }
    },
    "TsvVersionOptionsSchemaList": {
      "base": null,
      "refs": {
        "TsvVersionOptions$schema": "<p> The TSV schema for an annotation store version. </p>"
      }
    },
    "TypeList": {
      "base": null,
      "refs": {
        "Filter$type": "<p>The type of resources to be filtered. You can specify one or more of the resource types.</p>"
      }
    },
    "UntagResourceRequest": {
      "base": null,
      "refs": {
      }
    },
    "UntagResourceResponse": {
      "base": null,
      "refs": {
      }
    },
    "UpdateAnnotationStoreRequest": {
      "base": null,
      "refs": {
      }
    },
    "UpdateAnnotationStoreResponse": {
      "base": null,
      "refs": {
      }
    },
    "UpdateAnnotationStoreVersionRequest": {
      "base": null,
      "refs": {
      }
    },
    "UpdateAnnotationStoreVersionResponse": {
      "base": null,
      "refs": {
      }
    },
    "UpdateRunCacheRequest": {
      "base": null,
      "refs": {
      }
    },
    "UpdateRunGroupRequest": {
      "base": null,
      "refs": {
      }
    },
    "UpdateRunGroupRequestMaxCpusInteger": {
      "base": null,
      "refs": {
        "UpdateRunGroupRequest$maxCpus": "<p>The maximum number of CPUs to use.</p>"
      }
    },
    "UpdateRunGroupRequestMaxDurationInteger": {
      "base": null,
      "refs": {
        "UpdateRunGroupRequest$maxDuration": "<p>A maximum run time for the group in minutes.</p>"
      }
    },
    "UpdateRunGroupRequestMaxGpusInteger": {
      "base": null,
      "refs": {
        "UpdateRunGroupRequest$maxGpus": "<p>The maximum GPUs that can be used by a run group.</p>"
      }
    },
    "UpdateRunGroupRequestMaxRunsInteger": {
      "base": null,
      "refs": {
        "UpdateRunGroupRequest$maxRuns": "<p>The maximum number of concurrent runs for the group.</p>"
      }
    },
    "UpdateSequenceStoreRequest": {
      "base": null,
      "refs": {
      }
    },
    "UpdateSequenceStoreResponse": {
      "base": null,
      "refs": {
      }
    },
    "UpdateTime": {
      "base": null,
      "refs": {
        "AnnotationImportJobItem$updateTime": "<p>When the job was updated.</p>",
        "AnnotationStoreItem$updateTime": "<p>When the store was updated.</p>",
        "AnnotationStoreVersionItem$updateTime": "<p> The time stamp for when an annotation store version was updated. </p>",
        "GetAnnotationImportResponse$updateTime": "<p>When the job was updated.</p>",
        "GetAnnotationStoreResponse$updateTime": "<p>When the store was updated.</p>",
        "GetAnnotationStoreVersionResponse$updateTime": "<p> The time stamp for when an annotation store version was updated. </p>",
        "GetVariantImportResponse$updateTime": "<p>When the job was updated.</p>",
        "GetVariantStoreResponse$updateTime": "<p>When the store was updated.</p>",
        "ShareDetails$updateTime": "<p>The timestamp of the resource share update.</p>",
        "UpdateAnnotationStoreResponse$updateTime": "<p>When the store was updated.</p>",
        "UpdateAnnotationStoreVersionResponse$updateTime": "<p> The time stamp for when an annotation store version was updated. </p>",
        "UpdateVariantStoreResponse$updateTime": "<p>When the store was updated.</p>",
        "VariantImportJobItem$updateTime": "<p>When the job was updated.</p>",
        "VariantStoreItem$updateTime": "<p>When the store was updated.</p>"
      }
    },
    "UpdateVariantStoreRequest": {
      "base": null,
      "refs": {
      }
    },
    "UpdateVariantStoreResponse": {
      "base": null,
      "refs": {
      }
    },
    "UpdateWorkflowRequest": {
      "base": null,
      "refs": {
      }
    },
    "UpdateWorkflowRequestStorageCapacityInteger": {
      "base": null,
      "refs": {
        "UpdateWorkflowRequest$storageCapacity": "<p>The default static storage capacity (in gibibytes) for runs that use this workflow or workflow version. </p>"
      }
    },
    "UpdateWorkflowVersionRequest": {
      "base": null,
      "refs": {
      }
    },
    "UpdateWorkflowVersionRequestStorageCapacityInteger": {
      "base": null,
      "refs": {
        "UpdateWorkflowVersionRequest$storageCapacity": "<p>The default static storage capacity (in gibibytes) for runs that use this workflow version. The <code>storageCapacity</code> can be overwritten at run time. The storage capacity is not required for runs with a <code>DYNAMIC</code> storage type.</p>"
      }
    },
    "UploadId": {
      "base": null,
      "refs": {
        "AbortMultipartReadSetUploadRequest$uploadId": "<p>The ID for the multipart upload.</p>",
        "CompleteMultipartReadSetUploadRequest$uploadId": "<p>The ID for the multipart upload.</p>",
        "CreateMultipartReadSetUploadResponse$uploadId": "<p>The ID for the initiated multipart upload.</p>",
        "ListReadSetUploadPartsRequest$uploadId": "<p>The ID for the initiated multipart upload.</p>",
        "MultipartReadSetUploadListItem$uploadId": "<p> The ID for the initiated multipart upload. </p>",
        "UploadReadSetPartRequest$uploadId": "<p>The ID for the initiated multipart upload.</p>"
      }
    },
    "UploadReadSetPartRequest": {
      "base": null,
      "refs": {
      }
    },
    "UploadReadSetPartRequestPartNumberInteger": {
      "base": null,
      "refs": {
        "UploadReadSetPartRequest$partNumber": "<p>The number of the part being uploaded.</p>"
      }
    },
    "UploadReadSetPartResponse": {
      "base": null,
      "refs": {
      }
    },
    "UpstreamRepositoryPrefix": {
      "base": null,
      "refs": {
        "RegistryMapping$upstreamRepositoryPrefix": "<p>The repository prefix of the corresponding repository in the upstream registry.</p>"
      }
    },
    "Uri": {
      "base": null,
      "refs": {
        "CreateWorkflowRequest$containerRegistryMapUri": "<p>(Optional) URI of the S3 location for the registry mapping file.</p>",
        "CreateWorkflowVersionRequest$containerRegistryMapUri": "<p>(Optional) URI of the S3 location for the registry mapping file.</p>",
        "ImageDetails$image": "<p>The URI of the container image.</p>",
        "ImageDetails$sourceImage": "<p>URI of the source registry. If the URI is from a third-party registry, Amazon Web Services HealthOmics transforms the URI to the corresponding ECR path, using the pull-through cache mapping rules.</p>",
        "ImageMapping$sourceImage": "<p>Specifies the URI of the source image in the upstream registry.</p>",
        "ImageMapping$destinationImage": "<p>Specifies the URI of the corresponding image in the private ECR registry.</p>",
        "RegistryMapping$upstreamRegistryUrl": "<p>The URI of the upstream registry.</p>"
      }
    },
    "UserCustomDescription": {
      "base": null,
      "refs": {
        "CreateRunCacheRequest$description": "<p>Enter a description of the run cache.</p>",
        "GetRunCacheResponse$description": "<p>The run cache description.</p>",
        "UpdateRunCacheRequest$description": "<p>Update the run cache description.</p>"
      }
    },
    "UserCustomName": {
      "base": null,
      "refs": {
        "CreateRunCacheRequest$name": "<p>Enter a user-friendly name for the run cache.</p>",
        "GetRunCacheResponse$name": "<p>The run cache name.</p>",
        "RunCacheListItem$name": "<p>The name of the run cache.</p>",
        "UpdateRunCacheRequest$name": "<p>Update the name of the run cache.</p>"
      }
    },
    "ValidationException": {
      "base": "<p>The input fails to satisfy the constraints specified by an Amazon Web Services service.</p>",
      "refs": {
      }
    },
    "VariantImportItemDetail": {
      "base": "<p>Details about an imported variant item.</p>",
      "refs": {
        "VariantImportItemDetails$member": null
      }
    },
    "VariantImportItemDetails": {
      "base": null,
      "refs": {
        "GetVariantImportResponse$items": "<p>The job's items.</p>"
      }
    },
    "VariantImportItemSource": {
      "base": "<p>An imported variant item's source.</p>",
      "refs": {
        "VariantImportItemSources$member": null
      }
    },
    "VariantImportItemSources": {
      "base": null,
      "refs": {
        "StartVariantImportRequest$items": "<p>Items to import.</p>"
      }
    },
    "VariantImportJobItem": {
      "base": "<p>A variant import job.</p>",
      "refs": {
        "VariantImportJobItems$member": null
      }
    },
    "VariantImportJobItems": {
      "base": null,
      "refs": {
        "ListVariantImportJobsResponse$variantImportJobs": "<p>A list of jobs.</p>"
      }
    },
    "VariantStoreItem": {
      "base": "<p>A variant store.</p>",
      "refs": {
        "VariantStoreItems$member": null
      }
    },
    "VariantStoreItems": {
      "base": null,
      "refs": {
        "ListVariantStoresResponse$variantStores": "<p>A list of variant stores.</p>"
      }
    },
    "VcfOptions": {
      "base": "<p>Formatting options for a VCF file.</p>",
      "refs": {
        "FormatOptions$vcfOptions": "<p>Options for a VCF file.</p>"
      }
    },
    "VersionDeleteError": {
      "base": "<p> The error preventing deletion of the annotation store version. </p>",
      "refs": {
        "VersionDeleteErrorList$member": null
      }
    },
    "VersionDeleteErrorList": {
      "base": null,
      "refs": {
        "DeleteAnnotationStoreVersionsResponse$errors": "<p> Any errors that occur when attempting to delete an annotation store version. </p>"
      }
    },
    "VersionList": {
      "base": null,
      "refs": {
        "DeleteAnnotationStoreVersionsRequest$versions": "<p> The versions of an annotation store to be deleted. </p>"
      }
    },
    "VersionName": {
      "base": null,
      "refs": {
        "AnnotationImportJobItem$versionName": "<p> The name of the annotation store version. </p>",
        "AnnotationStoreVersionItem$versionName": "<p> The name of an annotation store version. </p>",
        "CreateAnnotationStoreRequest$versionName": "<p> The name given to an annotation store version to distinguish it from other versions. </p>",
        "CreateAnnotationStoreResponse$versionName": "<p> The name given to an annotation store version to distinguish it from other versions. </p>",
        "CreateAnnotationStoreVersionRequest$versionName": "<p> The name given to an annotation store version to distinguish it from other versions. </p>",
        "CreateAnnotationStoreVersionResponse$versionName": "<p> The name given to an annotation store version to distinguish it from other versions. </p>",
        "GetAnnotationImportResponse$versionName": "<p> The name of the annotation store version. </p>",
        "GetAnnotationStoreVersionResponse$versionName": "<p> The name given to an annotation store version to distinguish it from others. </p>",
        "StartAnnotationImportRequest$versionName": "<p> The name of the annotation store version. </p>",
        "UpdateAnnotationStoreVersionResponse$versionName": "<p> The name of an annotation store version. </p>",
        "VersionDeleteError$versionName": "<p> The name given to an annotation store version. </p>",
        "VersionList$member": null
      }
    },
    "VersionOptions": {
      "base": "<p> The options for an annotation store version. </p>",
      "refs": {
        "CreateAnnotationStoreVersionRequest$versionOptions": "<p> The options for an annotation store version. </p>",
        "CreateAnnotationStoreVersionResponse$versionOptions": "<p> The options for an annotation store version. </p>",
        "GetAnnotationStoreVersionResponse$versionOptions": "<p> The options for an annotation store version. </p>"
      }
    },
    "VersionStatus": {
      "base": null,
      "refs": {
        "AnnotationStoreVersionItem$status": "<p> The status of an annotation store version. </p>",
        "CreateAnnotationStoreVersionResponse$status": "<p>The status of an annotation store version.</p>",
        "GetAnnotationStoreVersionResponse$status": "<p> The status of an annotation store version. </p>",
        "ListAnnotationStoreVersionsFilter$status": "<p>The status of an annotation store version.</p>",
        "UpdateAnnotationStoreVersionResponse$status": "<p> The status of an annotation store version. </p>"
      }
    },
    "WorkflowArn": {
      "base": null,
      "refs": {
        "CreateWorkflowResponse$arn": "<p>The workflow's ARN.</p>",
        "GetWorkflowResponse$arn": "<p>The workflow's ARN.</p>",
        "WorkflowListItem$arn": "<p>The workflow's ARN.</p>"
      }
    },
    "WorkflowBucketOwnerId": {
      "base": null,
      "refs": {
        "CreateWorkflowRequest$workflowBucketOwnerId": "<p>The Amazon Web Services account ID of the expected owner of the S3 bucket that contains the workflow definition. If not specified, the service skips the validation.</p>",
        "CreateWorkflowVersionRequest$workflowBucketOwnerId": "<p>The Amazon Web Services account ID of the owner of the S3 bucket that contains the workflow definition. You need to specify this parameter if your account is not the bucket owner.</p>",
        "GetWorkflowVersionResponse$workflowBucketOwnerId": "<p>The Amazon Web Services account ID of the owner of the S3 bucket that contains the workflow definition.</p>"
      }
    },
    "WorkflowDefinition": {
      "base": null,
      "refs": {
        "CreateWorkflowRequest$definitionUri": "<p>The S3 URI of a definition for the workflow. The S3 bucket must be in the same Region as the workflow.</p>",
        "CreateWorkflowVersionRequest$definitionUri": "<p>The S3 URI of a definition for this workflow version. The S3 bucket must be in the same Region as this workflow version.</p>",
        "GetRunResponse$definition": "<p>The run's definition.</p>",
        "GetWorkflowResponse$definition": "<p>The workflow's definition.</p>",
        "GetWorkflowVersionResponse$definition": "<p>Definition of the workflow version.</p>"
      }
    },
    "WorkflowDescription": {
      "base": null,
      "refs": {
        "CreateWorkflowRequest$description": "<p>A description for the workflow.</p>",
        "GetWorkflowResponse$description": "<p>The workflow's description.</p>",
        "UpdateWorkflowRequest$description": "<p>A description for the workflow.</p>"
      }
    },
    "WorkflowDigest": {
      "base": null,
      "refs": {
        "GetRunResponse$digest": "<p>The run's digest.</p>",
        "GetWorkflowResponse$digest": "<p>The workflow's digest.</p>",
        "GetWorkflowVersionResponse$digest": "<p>The workflow version's digest.</p>",
        "WorkflowListItem$digest": "<p>The workflow's digest.</p>",
        "WorkflowVersionListItem$digest": "<p>The digest of the workflow version.</p>"
      }
    },
    "WorkflowEngine": {
      "base": null,
      "refs": {
        "CreateWorkflowRequest$engine": "<p>The workflow engine for the workflow. This is only required if you have workflow definition files from more than one engine in your zip file. Otherwise, the service can detect the engine automatically from your workflow definition.</p>",
        "CreateWorkflowVersionRequest$engine": "<p>The workflow engine for this workflow version. This is only required if you have workflow definition files from more than one engine in your zip file. Otherwise, the service can detect the engine automatically from your workflow definition.</p>",
        "GetWorkflowResponse$engine": "<p>The workflow's engine.</p>",
        "GetWorkflowVersionResponse$engine": "<p>The workflow engine for this workflow version.</p>"
      }
    },
    "WorkflowExport": {
      "base": null,
      "refs": {
        "WorkflowExportList$member": null
      }
    },
    "WorkflowExportList": {
      "base": null,
      "refs": {
        "GetWorkflowRequest$export": "<p>The export format for the workflow.</p>",
        "GetWorkflowVersionRequest$export": "<p>The export format for the workflow version.</p>"
      }
    },
    "WorkflowId": {
      "base": null,
      "refs": {
        "CreateWorkflowResponse$id": "<p>The workflow's ID.</p>",
        "CreateWorkflowVersionRequest$workflowId": "<p>The ID of the workflow where you are creating the new version. The <code>workflowId</code> is not the UUID.</p>",
        "CreateWorkflowVersionResponse$workflowId": "<p>The workflow's ID.</p>",
        "DeleteWorkflowRequest$id": "<p>The workflow's ID.</p>",
        "DeleteWorkflowVersionRequest$workflowId": "<p>The workflow's ID.</p>",
        "GetRunResponse$workflowId": "<p>The run's workflow ID.</p>",
        "GetWorkflowRequest$id": "<p>The workflow's ID.</p>",
        "GetWorkflowResponse$id": "<p>The workflow's ID.</p>",
        "GetWorkflowVersionRequest$workflowId": "<p>The workflow's ID. The <code>workflowId</code> is not the UUID.</p>",
        "GetWorkflowVersionResponse$workflowId": "<p>The workflow's ID.</p>",
        "ListWorkflowVersionsRequest$workflowId": "<p>The workflow's ID. The <code>workflowId</code> is not the UUID.</p>",
        "RunListItem$workflowId": "<p>The run's workflow ID.</p>",
        "StartRunRequest$workflowId": "<p>The run's workflow ID. The <code>workflowId</code> is not the UUID.</p>",
        "UpdateWorkflowRequest$id": "<p>The workflow's ID.</p>",
        "UpdateWorkflowVersionRequest$workflowId": "<p>The workflow's ID. The <code>workflowId</code> is not the UUID.</p>",
        "WorkflowListItem$id": "<p>The workflow's ID.</p>",
        "WorkflowVersionListItem$workflowId": "<p>The workflow's ID.</p>"
      }
    },
    "WorkflowList": {
      "base": null,
      "refs": {
        "ListWorkflowsResponse$items": "<p>A list of workflow items.</p>"
      }
    },
    "WorkflowListItem": {
      "base": "<p>A workflow.</p>",
      "refs": {
        "WorkflowList$member": null
      }
    },
    "WorkflowListToken": {
      "base": null,
      "refs": {
        "ListWorkflowsRequest$startingToken": "<p>Specify the pagination token from a previous request to retrieve the next page of results.</p>",
        "ListWorkflowsResponse$nextToken": "<p>A pagination token that's included if more results are available.</p>"
      }
    },
    "WorkflowMain": {
      "base": null,
      "refs": {
        "CreateWorkflowRequest$main": "<p>The path of the main definition file for the workflow. This parameter is not required if the ZIP archive contains only one workflow definition file, or if the main definition file is named <code>main</code>. An example path is <code>workflow-definition/main-file.wdl</code>.</p>",
        "CreateWorkflowVersionRequest$main": "<p>The path of the main definition file for this workflow version. This parameter is not required if the ZIP archive contains only one workflow definition file, or if the main definition file is named <code>main</code>. An example path is <code>workflow-definition/main-file.wdl</code>.</p>",
        "GetWorkflowResponse$main": "<p>The path of the main definition file for the workflow.</p>",
        "GetWorkflowVersionResponse$main": "<p>The path of the main definition file for the workflow.</p>"
      }
    },
    "WorkflowMetadata": {
      "base": null,
      "refs": {
        "GetWorkflowResponse$metadata": "<p>Gets metadata for the workflow.</p>",
        "GetWorkflowVersionResponse$metadata": "<p>The metadata for the workflow version.</p>",
        "WorkflowListItem$metadata": "<p>Any metadata available for the workflow. The information listed may vary depending on the workflow, and there may be no metadata to return.</p>",
        "WorkflowVersionListItem$metadata": "<p>Metadata for the workflow version.</p>"
      }
    },
    "WorkflowMetadataKey": {
      "base": null,
      "refs": {
        "WorkflowMetadata$key": null
      }
    },
    "WorkflowMetadataValue": {
      "base": null,
      "refs": {
        "WorkflowMetadata$value": null
      }
    },
    "WorkflowName": {
      "base": null,
      "refs": {
        "CreateWorkflowRequest$name": "<p>A name for the workflow (optional but highly recommended). A name helps you locate relevant information in the CloudWatch logs and the Amazon Web Services HealthOmics console.</p>",
        "GetWorkflowResponse$name": "<p>The workflow's name.</p>",
        "ListWorkflowsRequest$name": "<p>Filter the list by workflow name.</p>",
        "UpdateWorkflowRequest$name": "<p>A name for the workflow.</p>",
        "WorkflowListItem$name": "<p>The workflow's name.</p>"
      }
    },
    "WorkflowOwnerId": {
      "base": null,
      "refs": {
        "GetRunResponse$workflowOwnerId": "<p>The ID of the workflow owner.</p>",
        "GetWorkflowRequest$workflowOwnerId": "<p>The ID of the workflow owner.</p>",
        "GetWorkflowVersionRequest$workflowOwnerId": "<p>The 12-digit account ID of the workflow owner. The workflow owner ID can be retrieved using the <code>GetShare</code> API operation. If you are the workflow owner, you do not need to include this ID.</p>",
        "ListWorkflowVersionsRequest$workflowOwnerId": "<p>The 12-digit account ID of the workflow owner. The workflow owner ID can be retrieved using the <code>GetShare</code> API operation. If you are the workflow owner, you do not need to include this ID.</p>",
        "StartRunRequest$workflowOwnerId": "<p>The 12-digit account ID of the workflow owner that is used for running a shared workflow. The workflow owner ID can be retrieved using the <code>GetShare</code> API operation. If you are the workflow owner, you do not need to include this ID.</p>"
      }
    },
    "WorkflowParameter": {
      "base": "<p>A workflow parameter.</p>",
      "refs": {
        "WorkflowParameterTemplate$value": null
      }
    },
    "WorkflowParameterDescription": {
      "base": null,
      "refs": {
        "WorkflowParameter$description": "<p>The parameter's description.</p>"
      }
    },
    "WorkflowParameterName": {
      "base": null,
      "refs": {
        "WorkflowParameterTemplate$key": null
      }
    },
    "WorkflowParameterTemplate": {
      "base": null,
      "refs": {
        "CreateWorkflowRequest$parameterTemplate": "<p>A parameter template for the workflow. If this field is blank, Amazon Web Services HealthOmics will automatically parse the parameter template values from your workflow definition file. To override these service-generated default values, provide a parameter template. To view an example of a parameter template, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/parameter-templates.html\">Parameter template files</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
        "CreateWorkflowVersionRequest$parameterTemplate": "<p>A parameter template for this workflow version. If this field is blank, Amazon Web Services HealthOmics will automatically parse the parameter template values from your workflow definition file. To override these service-generated default values, provide a parameter template. To view an example of a parameter template, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/parameter-templates.html\">Parameter template files</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
        "GetWorkflowResponse$parameterTemplate": "<p>The workflow's parameter template.</p>",
        "GetWorkflowVersionResponse$parameterTemplate": "<p>The parameter template for the workflow version.</p>"
      }
    },
    "WorkflowRequestId": {
      "base": null,
      "refs": {
        "CreateWorkflowRequest$requestId": "<p>An idempotency token to ensure that duplicate workflows are not created when Amazon Web Services HealthOmics submits retry requests.</p>",
        "CreateWorkflowVersionRequest$requestId": "<p>An idempotency token to ensure that duplicate workflow versions are not created when Amazon Web Services HealthOmics submits retry requests.</p>"
      }
    },
    "WorkflowStatus": {
      "base": null,
      "refs": {
        "CreateWorkflowResponse$status": "<p>The workflow's status.</p>",
        "CreateWorkflowVersionResponse$status": "<p>The workflow version status.</p>",
        "GetWorkflowResponse$status": "<p>The workflow's status.</p>",
        "GetWorkflowVersionResponse$status": "<p>The workflow version's status.</p>",
        "WorkflowListItem$status": "<p>The workflow's status.</p>",
        "WorkflowVersionListItem$status": "<p>The status of the workflow version.</p>"
      }
    },
    "WorkflowStatusMessage": {
      "base": null,
      "refs": {
        "GetWorkflowResponse$statusMessage": "<p>The workflow's status message.</p>",
        "GetWorkflowVersionResponse$statusMessage": "<p>The workflow version's status message.</p>"
      }
    },
    "WorkflowTimestamp": {
      "base": null,
      "refs": {
        "GetWorkflowResponse$creationTime": "<p>When the workflow was created.</p>",
        "GetWorkflowVersionResponse$creationTime": "<p>When the workflow version was created.</p>",
        "WorkflowListItem$creationTime": "<p>When the workflow was created.</p>",
        "WorkflowVersionListItem$creationTime": "<p>The creation time of the workflow version.</p>"
      }
    },
    "WorkflowType": {
      "base": null,
      "refs": {
        "GetRunResponse$workflowType": "<p>The run's workflow type.</p>",
        "GetWorkflowRequest$type": "<p>The workflow's type.</p>",
        "GetWorkflowResponse$type": "<p>The workflow's type.</p>",
        "GetWorkflowVersionRequest$type": "<p>The workflow's type.</p>",
        "GetWorkflowVersionResponse$type": "<p>The workflow version's type.</p>",
        "ListWorkflowVersionsRequest$type": "<p>The workflow type.</p>",
        "ListWorkflowsRequest$type": "<p>Filter the list by workflow type.</p>",
        "StartRunRequest$workflowType": "<p>The run's workflow type. The <code>workflowType</code> must be specified if you are running a <code>READY2RUN</code> workflow. If you are running a <code>PRIVATE</code> workflow (default), you do not need to include the workflow type. </p>",
        "WorkflowListItem$type": "<p>The workflow's type.</p>",
        "WorkflowVersionListItem$type": "<p>The type of the workflow version.</p>"
      }
    },
    "WorkflowUuid": {
      "base": null,
      "refs": {
        "CreateWorkflowResponse$uuid": "<p>The universally unique identifier (UUID) value for this workflow.</p>",
        "CreateWorkflowVersionResponse$uuid": "<p>The universally unique identifier (UUID) value for this workflow version.</p>",
        "GetRunResponse$workflowUuid": "<p>The universally unique identifier (UUID) value for the workflow.</p>",
        "GetWorkflowResponse$uuid": "<p>The universally unique identifier (UUID) value for this workflow.</p>",
        "GetWorkflowVersionResponse$uuid": "<p>The universally unique identifier (UUID) value for this workflow version.</p>"
      }
    },
    "WorkflowVersionArn": {
      "base": null,
      "refs": {
        "CreateWorkflowVersionResponse$arn": "<p>ARN of the workflow version.</p>",
        "GetWorkflowVersionResponse$arn": "<p>ARN of the workflow version.</p>",
        "WorkflowVersionListItem$arn": "<p>ARN of the workflow version.</p>"
      }
    },
    "WorkflowVersionDescription": {
      "base": null,
      "refs": {
        "CreateWorkflowVersionRequest$description": "<p>A description for this workflow version.</p>",
        "GetWorkflowVersionResponse$description": "<p>Description of the workflow version.</p>",
        "UpdateWorkflowVersionRequest$description": "<p>Description of the workflow version.</p>",
        "WorkflowVersionListItem$description": "<p>The description of the workflow version.</p>"
      }
    },
    "WorkflowVersionList": {
      "base": null,
      "refs": {
        "ListWorkflowVersionsResponse$items": "<p>A list of workflow version items.</p>"
      }
    },
    "WorkflowVersionListItem": {
      "base": "<p>A workflow version item.</p>",
      "refs": {
        "WorkflowVersionList$member": null
      }
    },
    "WorkflowVersionListToken": {
      "base": null,
      "refs": {
        "ListWorkflowVersionsRequest$startingToken": "<p>Specify the pagination token from a previous request to retrieve the next page of results.</p>",
        "ListWorkflowVersionsResponse$nextToken": "<p>A pagination token that's included if more results are available.</p>"
      }
    },
    "WorkflowVersionName": {
      "base": null,
      "refs": {
        "CreateWorkflowVersionRequest$versionName": "<p>A name for the workflow version. Provide a version name that is unique for this workflow. You cannot change the name after HealthOmics creates the version.</p> <p>The version name must start with a letter or number, and it can include uppercase and lowercase letters, numbers, hyphens, periods, and underscores. The maximum length is 64 characters. You can use a simple naming scheme, such as version1, version2, version3. You can also match your workflow versions with your own internal versioning conventions, such as 2.7.0, 2.7.1, 2.7.2.</p>",
        "CreateWorkflowVersionResponse$versionName": "<p>The workflow version name.</p>",
        "DeleteWorkflowVersionRequest$versionName": "<p>The workflow version name.</p>",
        "GetRunResponse$workflowVersionName": "<p>The workflow version name.</p>",
        "GetWorkflowVersionRequest$versionName": "<p>The workflow version name.</p>",
        "GetWorkflowVersionResponse$versionName": "<p>The workflow version name.</p>",
        "RunListItem$workflowVersionName": "<p>The name of the workflow version.</p>",
        "StartRunRequest$workflowVersionName": "<p>The name of the workflow version. Use workflow versions to track and organize changes to the workflow. If your workflow has multiple versions, the run uses the default version unless you specify a version name. To learn more, see <a href=\"https://docs.aws.amazon.com/omics/latest/dev/workflow-versions.html\">Workflow versioning</a> in the <i>Amazon Web Services HealthOmics User Guide</i>.</p>",
        "UpdateWorkflowVersionRequest$versionName": "<p>The name of the workflow version.</p>",
        "WorkflowVersionListItem$versionName": "<p>The name of the workflow version.</p>"
      }
    }
  }
}
