Power consumption varies from instant to instant depending on what the screen is showing. Each pixel has red, green, and blue phosphor elements, which are energized in varying intensities to create the full range of color. Black takes the least power (all phosphors idle); white takes the most (all phosphors fully lit). Any given picture is a mix of pixels producing an array of colors at a range of brightnesses, so maximum power consumption would be a full white screen.
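For a rough feel of how picture content maps to power draw, here's a minimal sketch of that idea in Python. The idle-floor wattage is an assumption for illustration, the full-white figure matches the nameplate maximum worked out below, and real panels aren't perfectly linear:

```python
# Toy model: panel power vs. average picture brightness.
# ASSUMPTIONS: a 150 W idle floor (all phosphors dark) and a 600 W
# full-white maximum; real power draw is not this linear.

IDLE_WATTS = 150.0
FULL_WHITE_WATTS = 600.0

def estimated_watts(avg_brightness: float) -> float:
    """avg_brightness: 0.0 = all black, 1.0 = full white at max brightness."""
    return IDLE_WATTS + avg_brightness * (FULL_WHITE_WATTS - IDLE_WATTS)

for level in (0.0, 0.25, 0.5, 1.0):
    print(f"brightness {level:.2f} -> {estimated_watts(level):.0f} W")
```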
I put a Kill-A-Watt meter on my 42" plasma a few minutes ago for a look-see. It's 10 years old as of 9/28/2012, so it's likely less "efficient" than the latest units, although there may not be all that much improvement over the years, plasma display technology being what it is. The label on the unit doesn't state wattage, but it does say 120V, 5.0 amps. That's the maximum draw, which works out to 600 watts, and it would only occur on a full-white screen at maximum brightness.
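The 600-watt figure is simply the nameplate volts times amps; a quick sketch of that arithmetic, using the label values above:

```python
# Nameplate maximum power: volts x amps.
volts = 120.0  # from the label
amps = 5.0     # from the label

max_watts = volts * amps
print(f"Nameplate maximum: {max_watts:.0f} W")  # 600 W, full-white worst case
```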
It's not a good idea to set the brightness/contrast at maximum. It makes for an unnatural picture and "wears" the phosphors faster.
Monitoring the meter for a few minutes on a typical local ABC news broadcast, the lowest reading was about 187 watts, with a spike to 325. The average hangs between about 200 and 275 watts, equivalent to roughly 2 to 2-3/4 100-watt light bulbs.
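If you want to turn those meter readings into a running-cost estimate, here's a quick sketch. The viewing hours and electricity rate are assumptions you'd swap for your own:

```python
# Rough monthly energy use and cost from the measured average draw.
# ASSUMPTIONS: 240 W average (middle of the 200-275 W band observed above),
# 5 hours of viewing per day, and $0.12 per kWh.

avg_watts = 240.0
hours_per_day = 5.0
rate_per_kwh = 0.12

kwh_per_month = avg_watts / 1000.0 * hours_per_day * 30
cost_per_month = kwh_per_month * rate_per_kwh
print(f"{kwh_per_month:.1f} kWh/month, about ${cost_per_month:.2f}/month")
```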
A larger screen would of course pull correspondingly more power.
Yes, they generate some heat ... I suppose roughly equivalent to a CRT of the same size. Plasma shares the phosphor-on-glass approach with CRTs, but instead of a single focused electron beam scanning the screen from a fixed point, each sub-pixel is a tiny gas-filled cell that fires its own miniature discharge (a plasma), emitting ultraviolet light that lights the phosphor directly.