What is Elastic Transcoder

This recipe explains what Elastic Transcoder is.

What is Elastic Transcoder?

Elastic Transcoder is an AWS service that converts media files stored in an S3 bucket into formats supported by a variety of devices.

 

Elastic Transcoder is a cloud-based media transcoder. It converts media files from their original source format into formats that can be played on smartphones, tablets, PCs, and other devices.

It includes transcoding presets for popular output formats, so you don't have to guess which settings will work best on which devices.

With Elastic Transcoder, you pay based on the number of minutes you transcode and the resolution of the output.

 

Components of Elastic Transcoder

Elastic Transcoder is made up of four components:

  • Jobs
  • Pipelines
  • Presets
  • Notifications

   
  • Jobs
   

The primary responsibility of a job is to do the transcoding work. Each job can convert one file into up to 30 different output formats. For example, if you want to convert a media file into eight different formats, a single job creates files in all eight formats. When you create a job, you specify the name of the file to be transcoded.
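As an illustration, a job request built for the AWS SDK for Python might look like the following. This is a minimal sketch: the pipeline ID, object keys, and preset IDs are placeholder assumptions, and the actual service call is shown only in a comment.

```python
# Sketch of an Elastic Transcoder job request (hypothetical IDs and keys).
def build_job_request(pipeline_id, input_key, outputs):
    """Build the request body for elastictranscoder.create_job.

    `outputs` is a list of (output_key, preset_id) pairs -- one entry
    per desired output format, up to the 30-output limit per job.
    """
    return {
        "PipelineId": pipeline_id,
        "Input": {"Key": input_key},
        "Outputs": [
            {"Key": key, "PresetId": preset_id}
            for key, preset_id in outputs
        ],
    }

request = build_job_request(
    "1111111111111-abcde1",                      # hypothetical pipeline ID
    "uploads/source.mov",                        # key in the pipeline's input bucket
    [("out/video.mp4", "1351620000001-000010"),  # assumed ID of a system MP4 preset
     ("out/video.webm", "1351620000001-100240")],# assumed ID of a system WebM preset
)
# With boto3 installed and AWS credentials configured, you would submit it as:
# import boto3
# boto3.client("elastictranscoder").create_job(**request)
```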

   

   
  • Pipelines
   

Pipelines are the queues that hold your transcoding jobs. When you create a job, you specify which pipeline to add it to. If you specify multiple output formats in a job, Elastic Transcoder creates the files in the order you specify.

You can create two kinds of pipelines: one for standard-priority jobs and one for high-priority jobs. Most jobs fall into the standard-priority category; when you need a file transcoded quickly, you can use the high-priority pipeline.
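To sketch how a pipeline ties S3 buckets and an IAM role together, a create_pipeline request could be built as below. The bucket names and role ARN are hypothetical, and the real service call appears only in a comment.

```python
def build_pipeline_request(name, input_bucket, output_bucket, role_arn):
    """Build the request body for elastictranscoder.create_pipeline.

    The role must be an IAM role that grants Elastic Transcoder
    access to the input and output S3 buckets.
    """
    return {
        "Name": name,
        "InputBucket": input_bucket,
        "OutputBucket": output_bucket,
        "Role": role_arn,
    }

request = build_pipeline_request(
    "standard-priority",                          # hypothetical pipeline name
    "my-source-videos",                           # bucket holding files to transcode
    "my-transcoded-videos",                       # bucket receiving the outputs
    "arn:aws:iam::123456789012:role/transcoder",  # hypothetical IAM role ARN
)
# Submitted with boto3 as:
# import boto3
# boto3.client("elastictranscoder").create_pipeline(**request)
```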
   

   
  • Presets
   

Presets are templates that contain the settings for converting a media file from one format to another. Elastic Transcoder includes system presets for common formats, and you can also create your own presets for formats the system presets don't cover. When you create a job, you specify which preset to use.
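Jobs reference presets by ID, so a common step is looking a preset up by name in the list the service returns. A sketch of that lookup follows; the sample entries and the 480p preset ID are made-up stand-ins, and the boto3 call that would supply real data is shown in a comment.

```python
def find_preset_id(presets, name):
    """Return the Id of the first preset whose Name matches, else None.

    `presets` is the "Presets" list returned by
    elastictranscoder.list_presets (each entry has "Id" and "Name").
    """
    for preset in presets:
        if preset["Name"] == name:
            return preset["Id"]
    return None

# With boto3, the presets would come from the service:
# import boto3
# presets = boto3.client("elastictranscoder").list_presets()["Presets"]
# preset_id = find_preset_id(presets, "System preset: Generic 720p")

# Offline illustration with hypothetical entries:
sample = [
    {"Id": "1351620000001-000010", "Name": "System preset: Generic 720p"},
    {"Id": "1351620000001-000020", "Name": "System preset: Generic 480p"},
]
print(find_preset_id(sample, "System preset: Generic 480p"))  # 1351620000001-000020
```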

   

   
  • Notifications
   

Notifications are an optional setting in Elastic Transcoder. Through Amazon Simple Notification Service (SNS), they keep you up to date on the status of your job: when Elastic Transcoder begins processing it, when it finishes, and whether it encounters a warning or error condition. You configure notifications when you create a pipeline.
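When creating a pipeline, the notification settings map each job state to an SNS topic ARN. A sketch follows; the topic ARN is hypothetical.

```python
def build_notifications(progressing, completed, warning, error):
    """Build the Notifications block for elastictranscoder.create_pipeline.

    Each value is an SNS topic ARN, or "" to disable that notification.
    """
    return {
        "Progressing": progressing,  # job has started transcoding
        "Completed": completed,      # job finished successfully
        "Warning": warning,          # job produced a warning
        "Error": error,              # job failed
    }

topic = "arn:aws:sns:us-east-1:123456789012:transcode-events"  # hypothetical topic
notifications = build_notifications(topic, topic, topic, topic)
# Passed as the Notifications= parameter of create_pipeline.
```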

   
 

Features

 
   
  • Easy to use
   

Amazon Elastic Transcoder is intended to be simple to use. You can get started using the AWS Management Console, the service API, or the SDKs. System transcoding presets make it simple to get transcoding settings right the first time. There are pre-defined presets for creating media files that play on a wide range of devices (such as smartphones and tablets), as well as presets for creating media files optimised for playback on a specific device (like the Amazon Kindle Fire HD or Apple iPod touch). You can also create segmented files and playlists for delivery to compatible devices via the HLS, Smooth, or MPEG-DASH protocols. Developers building applications that require transcoding can use the AWS SDKs for Java, .NET, Node.js, PHP, Python, and Ruby, as well as the AWS Command Line Interface.

   

   
  • Elastically Scalable
   

Amazon Elastic Transcoder is built to scale with your media transcoding workload. Amazon Elastic Transcoder is designed to handle large amounts of media files as well as large file sizes. Transcoding pipelines allow you to perform multiple transcodes at the same time. To provide scalability and reliability, Amazon Elastic Transcoder makes use of other Amazon Web Services such as Amazon S3, Amazon EC2, Amazon DynamoDB, Amazon Simple Workflow (SWF), and Amazon Simple Notification Service (SNS).

   

   
  • Cost Effective
   

Amazon Elastic Transcoder has a content duration-based pricing model, which means you pay based on the length of the output media in minutes. For example, if the transcoded output of your video is 30 minutes long, you will be charged for 30 minutes of transcoding. Similarly, if you make a 20-minute video clip out of a 30-minute input file, you will be charged for 20 minutes of transcoding. And if you combine two 5-minute input files into a single 10-minute output file, you will be charged for 10 minutes of transcoding. There are no minimum transcoding volumes, monthly commitments, or long-term contracts with Amazon Elastic Transcoder.
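The duration-based billing described above can be sketched in a few lines. The per-minute rate here is a made-up assumption for illustration, not an actual AWS price; real rates vary by output resolution and region.

```python
def transcoding_cost_cents(output_minutes, rate_cents_per_minute):
    """Billing is driven by the duration of the *output*, in minutes."""
    return output_minutes * rate_cents_per_minute

RATE = 2  # hypothetical cents per minute; actual pricing differs by resolution/region

# 30-minute output from a 30-minute input: billed for 30 minutes.
print(transcoding_cost_cents(30, RATE))  # 60
# 20-minute clip cut from a 30-minute input: billed for 20 minutes.
print(transcoding_cost_cents(20, RATE))  # 40
# Two 5-minute inputs combined into one 10-minute output: billed for 10 minutes.
print(transcoding_cost_cents(10, RATE))  # 20
```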

   

   
  • Managed
   

Amazon Elastic Transcoder allows you to concentrate on your content rather than managing transcoding software in a distributed cloud environment. The service manages the process of keeping codecs up to date as well as scaling and operating the system. When combined with our service API and SDKs, this makes it simple to create media solutions that utilise Amazon Elastic Transcoder.

   

   
  • Secured
   

Your content is in your hands: your assets are stored in your own Amazon S3 buckets, which you grant the service access to via IAM roles. This makes it simple to integrate into your existing security and identity framework without sacrificing control. Amazon Elastic Transcoder was built using security best practices learned while building other Amazon Web Services. Visit the AWS Security Center for more information on AWS security; AWS Compliance has more information on compliance, including MPAA best practices.

   

   
  • Seamless Delivery
   

You can store, transcode, and deliver your content using Amazon Elastic Transcoder, Amazon S3, and Amazon CloudFront. By setting the S3 permissions for your CloudFront distribution in Amazon Elastic Transcoder, transcoding content and delivering the output videos via progressive download or adaptive bitrate streaming (HLS, Smooth, or MPEG-DASH) with CloudFront becomes a simple one-step process.

   

   
  • AWS integration
   

Amazon Elastic Transcoder is a critical media building block for AWS end-to-end media solutions. For example, you can use Amazon Glacier to store master content, Amazon Elastic Transcoder to convert masters to renditions for distribution stored in Amazon S3, Amazon CloudFront to stream these renditions at scale over the Internet, and CloudWatch to monitor the health of your transcoding workflow.

   
 

