 Originally Posted by wufwugy
The return on each dollar change is marginal. I'm not sure why the firm assessed that if it just invested more it would profit more.
Because that's how you determine price in a situation like this. You can't just try to pinpoint what you think customers are willing to pay; that's a dangerous game when demand for your product is inelastic.
It goes...
(Total Capital Investment x Fair Rate of Return) + Operating Expenses = Total Revenue
Total Revenue / Kilowatt-Hours Produced = Price per Kilowatt-Hour
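To make that concrete, here's a minimal sketch of the calculation. Every figure below (capital base, allowed rate, expenses, output) is made up purely for illustration:

```python
# Minimal sketch of rate-of-return (cost-of-service) pricing.
# All numbers are hypothetical, chosen only to show the arithmetic.

capital_investment = 500_000_000   # total capital invested in plant ($)
fair_rate_of_return = 0.08         # rate of return allowed by the regulator
operating_expenses = 60_000_000    # annual operating expenses ($)
kwh_produced = 1_200_000_000       # annual output (kWh)

# Allowed revenue covers operating costs plus a return on invested capital.
total_revenue = capital_investment * fair_rate_of_return + operating_expenses

# The regulated price is just that revenue spread over output.
price_per_kwh = total_revenue / kwh_produced

print(f"Allowed total revenue: ${total_revenue:,.0f}")   # $100,000,000
print(f"Price per kWh: ${price_per_kwh:.4f}")            # $0.0833
```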
Why did they do that? Did they assess they would make the same rate of return on less quality investments? Why would they make that assessment?
See the equation above. Invested capital earns a set rate of return, so more capital invested at the same rate of return = more actual dollars of profit.
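In other words, at a fixed allowed rate, the profit in dollars scales directly with the size of the capital base. A toy illustration (hypothetical numbers again):

```python
# At a fixed allowed rate of return, allowed profit in dollars
# grows linearly with the capital base. Hypothetical figures.

fair_rate_of_return = 0.08

for capital in (500_000_000, 750_000_000, 1_000_000_000):
    allowed_profit = capital * fair_rate_of_return
    print(f"Capital ${capital:>13,}: allowed profit ${allowed_profit:,.0f}")
```

Which is why a firm priced this way has a reason to grow the capital base itself: each extra invested dollar earns the allowed rate.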
If it is the case that the firm assessed returns based on expectations about what the oversight board would do (it sounds like they did, but you can correct me on that if they didn't),
That's about right.
then that explains some important perverse incentive problems that contributed to that weird investment strategy.
What's the perverse incentive?
Are you talking about the project managers who have a mandate to spend, rather than save?