I don't think you'll want to start a new FFMPEG instance every time someone uploads a file for transcoding. Instead, you'll probably want to start the same number of FFMPEG processes as you have CPUs, then queue up the input files and transcode them in the order they were received. You could do this all on one computer: the server that accepts the uploads and puts them in the queue shouldn't need much CPU, so it can probably coexist just fine with the FFMPEG processes.
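A rough sketch of that local-queue setup in Python (untested; the "uploads" directory, the ffmpeg flags, and the way files get fed into the queue are all placeholders you'd swap for your own):

    import multiprocessing
    import os
    import subprocess

    def transcode_worker(job_queue):
        # Each worker pulls one uploaded file at a time and runs ffmpeg on it.
        while True:
            src = job_queue.get()              # blocks until a file is queued
            if src is None:                    # sentinel: no more work
                break
            dst = os.path.splitext(src)[0] + ".mp4"
            subprocess.run(["ffmpeg", "-y", "-i", src, dst], check=False)

    if __name__ == "__main__":
        queue = multiprocessing.Queue()
        # One ffmpeg worker per CPU core.
        workers = [multiprocessing.Process(target=transcode_worker, args=(queue,))
                   for _ in range(multiprocessing.cpu_count())]
        for w in workers:
            w.start()

        # The upload server would call queue.put(path) as files arrive;
        # here we just enqueue whatever is already sitting in "uploads".
        for name in sorted(os.listdir("uploads")):
            queue.put(os.path.join("uploads", name))

        for _ in workers:
            queue.put(None)                    # one sentinel per worker
        for w in workers:
            w.join()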
Depending on how big you want to scale (if you want to run more than just a few FFMPEG processes on a single machine), you could easily make this distributed, and this is where SQS would come in handy. You could run one FFMPEG process per core, and instead of pulling jobs from a local queue, each process would pull them from SQS. Then you could spin up as many transcoding processes as you need, on different machines.
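Each transcoding process would then just be a small loop that polls SQS instead of a local queue. Something along these lines, using boto3 (the queue URL and the JSON message layout are assumptions on my part):

    import json
    import subprocess
    import boto3

    QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/transcode-jobs"  # hypothetical

    sqs = boto3.client("sqs")

    while True:
        resp = sqs.receive_message(QueueUrl=QUEUE_URL,
                                   MaxNumberOfMessages=1,
                                   WaitTimeSeconds=20)   # long polling
        for msg in resp.get("Messages", []):
            job = json.loads(msg["Body"])                # e.g. {"input": ..., "output": ...}
            subprocess.run(["ffmpeg", "-y", "-i", job["input"], job["output"]], check=False)
            # Delete only after the transcode finishes, so a crashed worker
            # lets the message become visible to another worker.
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])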
The downside is that you'll need to transfer the raw videos from the server that accepts them to the server that transcodes them. You could put them in S3 and pull them back out, though I don't remember off the top of my head whether you pay for that transfer. Alternatively, you could keep them on the hard disk of the machine that received them and have the transcoding process fetch the raw files from there.
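If you do route the raw files through S3, the transfer itself is only a couple of calls on each side (bucket and key names here are made up):

    import boto3

    s3 = boto3.client("s3")

    # On the server that accepted the upload:
    s3.upload_file("uploads/raw_video.avi", "my-transcode-bucket", "raw/raw_video.avi")

    # On the machine that runs ffmpeg:
    s3.download_file("my-transcode-bucket", "raw/raw_video.avi", "/tmp/raw_video.avi")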
Great, thanks for your response. I'm now at the stage where the server that handles the uploads calls FFMPEG to process the uploaded video, then writes the encoded file to Amazon S3. While this is happening, though, the script waits until all processes have finished, i.e. the user has to wait for the video to encode before the next video can upload. I agree with you that I can probably manage the uploading and encoding on a single machine, but how do you suggest I run the transcoding in the background, and how do I detect when a file has been transcoded so I can copy it to S3?
Thanks again – undefined Oct 2 '09 at 15:12

So you have a process that is doing the transcoding; can't that same process just put the file in S3 when it's done with it? Maybe when the web-facing app kicks off the transcoding process, it can pass in an argument that tells the transcoding process where in S3 to put it. – teeks99 Oct 3 '09 at 13:18
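One way to put that suggestion together (a sketch, not tested; the argument order and the bucket/key names are assumptions): a small script that does the encode and then the S3 upload itself, with the web app passing in the destination and launching it in the background.

    # transcode_and_upload.py
    import subprocess
    import sys
    import boto3

    src, dst, bucket, key = sys.argv[1:5]

    # Blocks until ffmpeg finishes; since this script runs in the background,
    # the upload handler has already returned to the user.
    subprocess.run(["ffmpeg", "-y", "-i", src, dst], check=True)

    # The same process uploads the result as soon as the encode is done.
    boto3.client("s3").upload_file(dst, bucket, key)

The web-facing app would then kick it off without waiting, e.g. subprocess.Popen(["python", "transcode_and_upload.py", src, dst, bucket, key]), so the user isn't held up while the encode runs.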