Originally Posted by Nigel Goodwin:
“You don't think all the work is done inside the PC?
And USB isn't limited to 0.5W anyway.”
Like it or not, c4rv hit the nail on the head: if a device exists as a USB peripheral, its power consumption is by definition small. You worry about tuning power, but that has been shown to be next to nothing, and you can't then point to processing in the PC either, because all the PC would do is decode the video, and dedicated decoder hardware does that far more efficiently than a general-purpose CPU (assuming no GPU decode). That's why handheld devices like iPhones and iPads can play HD video without melting in your hand.

The fact that an iPad can play back video for something like 10 hours is down to the efficiency of dedicated video codec hardware, the same as in just about any device playing video these days. Such embedded decode blocks have been common for years, so I don't get why you still think decoding video is a big deal. Every manufacturer, or the third-party suppliers making the dedicated chips, has been working on this exact problem for many years now.
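Just as a back-of-envelope sanity check (the 10 hours is the claimed playback figure; the battery capacity is my own rough assumption, not a number from any spec sheet):

    # rough estimate in Python; all inputs are assumptions for illustration
    battery_wh = 25.0        # assumed iPad battery capacity, watt-hours
    playback_hours = 10.0    # claimed HD video playback time
    avg_power_w = battery_wh / playback_hours
    print(avg_power_w)       # ~2.5 W for the ENTIRE device: screen, SoC, decode block, audio

If the whole device averages only a couple of watts with the screen on, the dedicated decoder block itself can only be a small fraction of that.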
It's such a passé concern that low-power HD video encoding chips have been around for quite some time now too, and encoding is a far harder problem to solve than decoding, so you seem to be working from an impression of the tech that's rather out of date.
Looking at this 2009 document on a TV system-on-a-chip:
https://docs.google.com/viewer?a=v&q...94T4jsiGWTgXRw
It shows the entire TV control chip, including video decode, draws less than 7 watts. Manufacturers have integrated more and more functions onto a single chip, which they call a TV system-on-a-chip, and it has only become more extreme now in 2012; every manufacturer either has their own version of this or sources it from a third party.
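And to circle back to the USB point in the quote, here's a quick sketch of the USB 2.0 bus-power budget (the 5 V / 100 mA unit-load figures are from the USB 2.0 spec; the comment about what a tuner stick actually has to power is my own reading of it):

    # USB 2.0 bus power: VBUS is nominally 5 V, devices draw in 100 mA unit loads
    vbus_v = 5.0
    low_power_ma = 100       # unconfigured / low-power device: 1 unit load
    high_power_ma = 500      # configured high-power device: 5 unit loads
    print(vbus_v * low_power_ma / 1000.0)    # 0.5 W
    print(vbus_v * high_power_ma / 1000.0)   # 2.5 W
    # A bus-powered tuner stick only has to fit the tuner/demodulator front end
    # into that 2.5 W; the decode lands on the PC's dedicated/GPU hardware.

So even a whole TV SoC at under 7 W isn't far above a single USB port's budget, and the stick itself only needs the tuner/demod slice of that.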