I have been researching the feasibility of running computer vision applications in the cloud and the costs involved. Here is an article I came across:
http://radar.oreilly.com/2013/12/how-to-analyze-100-million-images-for-624.html
Here are a couple of quotes from the article. Regarding processing:
“I use m1.xlarge servers on Amazon EC2, which are beefy enough to process two million Instagram-size photos a day, and only cost $12.48! I’ve used some open source frameworks to distribute the work in a completely scalable way, so this works out to $624 for a 50-machine cluster that can process 100 million pictures in 24 hours. That’s just 0.000624 cents per photo!”
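To make sure I understand the arithmetic, here is a quick back-of-the-envelope check in Python. The hourly rate isn't stated in the article; I'm just backing it out from the $12.48-per-machine-day figure, so treat it as an assumption rather than a verified EC2 price.

```python
# Sanity check of the article's processing-cost numbers.
# hourly_rate is implied by the quoted $12.48/machine-day, not a quoted AWS price.

hourly_rate = 12.48 / 24            # ~$0.52/hr, implied m1.xlarge on-demand rate
machines = 50
photos_per_machine_per_day = 2_000_000

cluster_cost_per_day = machines * hourly_rate * 24        # $624.00
photos_per_day = machines * photos_per_machine_per_day    # 100,000,000
cost_per_photo_cents = cluster_cost_per_day / photos_per_day * 100

print(f"Cluster cost per day: ${cluster_cost_per_day:,.2f}")
print(f"Photos per day:       {photos_per_day:,}")
print(f"Cost per photo:       {cost_per_photo_cents:.6f} cents")   # 0.000624
```

So the $624/day and 0.000624 cents per photo figures do line up with 50 machines at roughly $0.52/hr.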
Regarding fetching the data:
“If you have three different external services you’re pulling from, with four servers assigned to each service, you end up taking about 30 days to download 100 million photos. That’s $4,492.80 for the initial pull, which is not chump change, but not wildly outside a startup budget, especially if you plan ahead and use reserved instances to reduce the cost.”
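The download figure checks out the same way, again assuming the same implied hourly rate (I haven't verified current pricing or transfer speeds independently):

```python
# Sanity check of the article's download-phase cost.
# Same assumed ~$0.52/hr rate backed out of the processing quote.

hourly_rate = 12.48 / 24        # ~$0.52/hr, assumption
services = 3
servers_per_service = 4
days = 30

servers = services * servers_per_service     # 12 download servers
total_hours = servers * days * 24            # 8,640 machine-hours
download_cost = total_hours * hourly_rate    # $4,492.80

print(f"Servers:       {servers}")
print(f"Machine-hours: {total_hours:,}")
print(f"Download cost: ${download_cost:,.2f}")
```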