Abstract—Most Independent System Operators (ISOs) adopt Bid Cost Minimization (BCM) to select offers and their respective generation levels while minimizing the total bid cost. It has been shown that the customer payment costs resulting from the selected offers can differ significantly from those obtained under Payment Cost Minimization (PCM), in which payment costs are minimized directly. To solve the PCM in the dual space, the Lagrangian relaxation and surrogate optimization approach is frequently used. When standard optimization methods, such as branch-and-cut, become ineffective due to the large size of a problem, this approach provides a good feasible solution within a reasonable CPU time. The convergence of the standard Lagrangian relaxation and surrogate subgradient approach, however, depends on the optimal dual value, which is usually unknown. Furthermore, under the surrogate subgradient approach the upper-bound property is lost, so additional conditions are needed to ensure convergence. The main goal of this paper is to develop a convergent variation of the surrogate subgradient method that does not invoke the optimal dual value, and to show the relevance and effectiveness of the new method for solving large constrained optimization problems such as the PCM.

I. INTRODUCTION

Currently, most ISOs in the United States adopt the bid cost minimization (BCM) settlement mechanism for minimizing total offer costs. In this setup, customer payment costs, which are determined by a mechanism that assigns uniform market clearing prices (MCPs), are different from the minimized offer costs. An alternative method to determine customer payment costs is to ...

... middle of paper ...

...amic step size [8]. Perhaps the most recent and exhaustive survey of subgradient methods for convex optimization is [2].

The Lagrangian relaxation and surrogate subgradient optimization approach was specifically treated in [6] and [5]. The former paper develops the surrogate subgradient method and proves its convergence. Compared to subgradient and gradient methods, the surrogate subgradient approach finds better and smoother directions within less CPU time. The latter paper extends the methodology to solving coupled problems. Since the optimal dual value and the optimal multipliers are generally unknown, a version of the surrogate subgradient method is needed whose convergence does not depend on the optimal dual value.
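The dependence on the optimal dual value can be made concrete with a small sketch in the spirit of the surrogate subgradient method of [6]. The instance below (minimize c·x subject to a single coupling constraint a·x ≥ b, with binary x) is entirely made up for illustration, and `q_star` is hard-coded for this toy instance only: only one subproblem is re-optimized per iteration, and the Polyak-type step size requires exactly the optimal dual value that is unknown in practice.

```python
import numpy as np

# Toy separable problem: min c.x  s.t.  a.x >= b,  x_i in {0,1}.
# Surrogate subgradient idea, in the spirit of [6]: re-optimize only ONE
# subproblem per iteration and use g~ = b - a.x as the search direction.
# The Polyak-type step below needs q_star, the optimal dual value -- the
# very quantity that is unknown in practice; it is hard-coded here.
c = np.array([2.0, 3.0, 6.0, 5.0])   # illustrative bid costs
a = np.array([2.0, 1.0, 3.0, 2.0])   # coupling-constraint coefficients
b = 5.0
q_star = 8.0                         # dual optimum of this toy instance only

def lagrangian(x, lam):
    """Surrogate dual value L(x, lam) = c.x + lam*(b - a.x)."""
    return c @ x + lam * (b - a @ x)

x, lam = np.zeros(4), 0.0
for k in range(60):
    i = k % len(c)                       # update a single subproblem only
    x_new = x.copy()
    x_new[i] = 1.0 if c[i] - lam * a[i] < 0 else 0.0
    if lagrangian(x_new, lam) <= lagrangian(x, lam):
        x = x_new                        # accept only if L does not increase
    g = b - a @ x                        # surrogate subgradient
    if g != 0.0:
        s = 0.5 * (q_star - lagrangian(x, lam)) / (g * g)   # needs q_star
        lam = max(0.0, lam + s * g)      # projected step toward dual maximum
```

For this instance the multipliers converge to a dual-optimal point (λ = 2), but note that removing the hard-coded `q_star` breaks the step-size rule entirely, which is the motivation for the method developed in this paper.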

The convergence of the surrogate subgradient method with a dynamic or constant step size still remains an open question.
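For contrast, a plain subgradient iteration with a diminishing step size 1/k avoids any reference to the optimal dual value, at the cost of the slow, non-monotone convergence that motivates the question above. The following is a minimal sketch on the same kind of illustrative instance (all data made up; the relaxed problem is solved fully at each iteration):

```python
import numpy as np

# Toy separable problem: min c.x  s.t.  a.x >= b,  x_i in {0,1}.
# Relaxing the coupling constraint with multiplier lam >= 0 gives the dual
#   q(lam) = lam*b + sum_i min(0, c_i - lam*a_i),
# which is maximized by projected subgradient ascent. Data are illustrative.
c = np.array([2.0, 3.0, 6.0, 5.0])
a = np.array([2.0, 1.0, 3.0, 2.0])
b = 5.0

def solve_relaxed(lam):
    """Fully minimize the relaxed Lagrangian; it decomposes per variable."""
    x = (c - lam * a < 0).astype(float)
    q = lam * b + np.minimum(0.0, c - lam * a).sum()
    return x, q

lam, best_q = 0.0, -np.inf
for k in range(1, 201):
    x, q = solve_relaxed(lam)
    best_q = max(best_q, q)              # track the best dual value seen
    g = b - a @ x                        # subgradient of q at lam
    lam = max(0.0, lam + (1.0 / k) * g)  # diminishing step: no q* needed
```

The diminishing rule guarantees convergence for the ordinary subgradient method, but for the surrogate variant, where the relaxed problem is only approximately optimized, no analogous guarantee is available.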

