I've been curious for a long time about the effect of a wicket in T20 cricket. Here is a way to think about it.
As an example, the yellow box in the two tables shows the following:
1. There have been 24 instances in which the team batting first has lost 3 wickets at the start of the 8th over of its innings.
2. In those 24 instances, the team scored 6.3 runs on average in that over.
The table does not, however, show the effect of a partnership. For example, the average for teams that have lost 3 wickets at the start of the 9th over takes no account of what those teams scored in the 8th over. The table, in effect, reflects the resources available to a team given its position at the start of an over in the match.
In the table immediately above, I attempt to compute the cost of losing a wicket in a particular over of a T20 game, batting first. The "RR" column at the extreme right of the table gives the average number of runs scored in a given over (disregarding how many wickets a team has lost at the start of the over).
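For concreteness, here is a minimal sketch of how such a table could be assembled, assuming the ball-by-ball record has already been reduced to (runs, wickets) per over for each first innings. The function name, input shape and table format are my own, not something taken from the original data.

```python
from collections import defaultdict

def build_over_table(first_innings):
    """For each (over number, wickets lost at the start of that over),
    return the number of instances and the average runs scored in that over.

    `first_innings` is assumed to be a list of innings, each a list of
    (runs_in_over, wickets_in_over) tuples, one per over actually bowled.
    """
    cells = defaultdict(list)
    for innings in first_innings:
        wickets_down = 0
        for over_no, (runs, wickets) in enumerate(innings, start=1):
            # Key each over by the wickets already down when it began
            cells[(over_no, wickets_down)].append(runs)
            wickets_down += wickets
    return {key: (len(scores), sum(scores) / len(scores))
            for key, scores in cells.items()}
```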
A quick comparison with the wickets-lost columns (reading left to right for each over, the run rate generally drops, as you would expect it to as the number of wickets lost at the start of the over increases) provides a threshold number of wickets lost for each over (shown in orange), and a threshold total for the side (about 143, in the orange box at the bottom left of the table). What this suggests, in essence, is that 143 is a par total in the average T20 game. Losing a given wicket earlier than the orange trajectory of the innings suggests results in a lower expected total (at least some, if not all, of the remaining overs would then be drawn from the red figures).
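The par figure is, as I read it, simply the sum of the table's averages along the orange trajectory. A small sketch of that sum, assuming the table format built above (a dict keyed by (over, wickets lost) with (instances, average runs) values):

```python
def trajectory_total(table, wickets_at_start_of_each_over):
    """Expected innings total along a given wicket trajectory: the sum of
    the table's average runs for each over, given the wickets already down
    when that over begins."""
    return sum(table.get((over_no, wkts), (0, 0.0))[1]
               for over_no, wkts in enumerate(wickets_at_start_of_each_over, start=1))
```

Summing along the most and least favourable trajectories in the same way is what gives the best-case and worst-case figures mentioned towards the end of this post.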
You will see how such a method would enable you to derive an expected total while an innings is in progress, based on how many wickets have been lost at the start of a given over. A normalized figure, rather than an absolute average, may well work better, so that expected totals can be arrived at more realistically for a particular set of conditions.
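Here is a sketch of that projection, again assuming the table above; like the basic table, it takes no account of wickets that may still fall.

```python
def project_total(table, current_score, next_over, wickets_down, total_overs=20):
    """Project a final total from the position at the start of `next_over`,
    adding the table's average for each remaining over. Assumes no further
    wickets fall; overs with no precedent in the data contribute nothing."""
    projected = current_score
    for over_no in range(next_over, total_overs + 1):
        projected += table.get((over_no, wickets_down), (0, 0.0))[1]
    return projected
```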
This could also be used to determine available resources (and therefore targets) in rain-affected scenarios.
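One crude way to express that, under the same assumptions, is as a fraction of the expected runs available to a side before its innings starts. This is only an analogue of a resource table, not the DLS method, and the function below is hypothetical.

```python
def resources_left(table, next_over, wickets_down, total_overs=20):
    """Expected runs still available from this position, as a fraction of
    the expected runs available at 0 down before the 1st over. A revised
    target could then be scaled by this fraction."""
    full = project_total(table, 0, 1, 0, total_overs)
    left = project_total(table, 0, next_over, wickets_down, total_overs)
    return left / full if full else 0.0
```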
The most promising use of this, from my point of view, would be to measure the cost of a risk that didn't come off. Batsmen are always taking risks against bowlers in T20 games; what is the value to the batting side of a risk coming off, versus it not coming off? For example, if a team is 4 wickets down at the start of the 15th over, it would be expected to score 9 runs in that over according to this table. A more sophisticated version of this table is possible with more data and more time, one which would provide separate figures for the expected runs per over depending on whether no wicket, one wicket or two wickets fell in the 15th over. But using this basic table, losing a wicket in the 15th over (given that a team is already 4 down) appears to cost the batting side about 7 runs by the end of the innings (assuming no further losses).
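That 7-run figure can be read off as the difference between two projections from the start of the 16th over, one with 4 wickets down and one with 5. A sketch, reusing project_total from above:

```python
def wicket_cost(table, over_no, wickets_down, total_overs=20):
    """Expected runs given up over the rest of the innings by losing one
    extra wicket during `over_no`, with no further wickets in either case."""
    keep = project_total(table, 0, over_no + 1, wickets_down, total_overs)
    lose = project_total(table, 0, over_no + 1, wickets_down + 1, total_overs)
    return keep - lose

# On the figures quoted above, wicket_cost(table, over_no=15, wickets_down=4)
# would come to roughly 7 runs.
```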
Given this data, the best score for the average T20 side (the top 8 Test-playing teams playing amongst themselves) batting first is 198/2, while the worst score, assuming the side is not bowled out, is 103/9.
I encourage you to consider the method proposed here. These figures would be far more robust given a bigger set of games. I wonder whether including all T20 games above a certain standard (say, those played by first-class cricketers, and by non-first-class teams that have nevertheless qualified for T20 World Cups) would make a reasonable data set. In this particular instance, I have chosen to limit myself to games played by one of the top 8 Test-playing teams against another.