
I am looking to minimize the squared error between a target and the sum-product of my degrees of freedom (portfolio weights) and betas. For some reason, I can't figure out why the solver only ever applies my initial guess and no iterations take place. Any ideas? I am trying to pass `tempBetas['Weight']` to the solver as the degrees of freedom; the weights must sum to 1, and the squared error between 0 and the sum-product of the weights and betas must be minimized.

When I run the code, nothing complains, but there is only one iteration, and for some reason the loop seems to happen N times rather than the 2 that I have outlined in the range of the for loop. Thanks in advance for any help!

As it stands now, the data that I import into `BetaResults` looks like this:

| Index | ISIN | 04/30/2016 | 05/30/2016 |
|-------|------|------------|------------|
| 0     | 1111 | 0.3        | 0.35       |
| 1     | 1112 | 0.4        | 0.42       |

I am trying to take the betas for each date and calculate a weighted average such that the difference between the weighted average and my target is as close to 0 as possible.
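To make the objective concrete, here is a minimal sketch of the quantity being minimized, using the two sample rows above and an assumed equal-weight starting point (the column names are taken from the table; the real `BetaResults` data isn't provided):

```python
import pandas as pd
import numpy as np

# The sample data shown above (column names assumed from the table)
BetaResults = pd.DataFrame({
    'ISIN': ['1111', '1112'],
    '04/30/2016': [0.3, 0.4],
    '05/30/2016': [0.35, 0.42],
})

weights = np.array([0.5, 0.5])   # equal-weight starting point
target = 0.0

# weighted-average beta for the first date column
weighted_beta = np.sum(weights * BetaResults['04/30/2016'].to_numpy())

# squared error against the target -- this is what the solver should drive to 0
sq_error = (target - weighted_beta) ** 2
```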

    import pandas as pd
    import datetime
    import numpy as np
    import yfinance as yf
    import math
    import scipy

    from scipy.optimize import minimize

    numDates    = len(BetaResults.iloc[0,1:])
    numEquities = len(BetaResults.iloc[:,0]) 

    equalWeight = 1 / numEquities

    allocationWeight = BetaResults[['isin']]

    SolverTarget = 0

    def minSqError(df,a,b):
        print(a,b)
        temp  = a * b
        temp2 = np.sum(temp)
        y = (SolverTarget - temp2)**2    
        return y

    def Weight_constraint(x):
        PortWeight = 1
        loopsize = len(x)
        for i in range(loopsize):
           TotalWeight = PortWeight - x[i]
        return TotalWeight

    for i in range(1,2):
    #-------------------------------------------------------------------------------------------------------------    
        tempBetas                    = BetaResults.iloc[:,i]
        tempBetas                    = pd.DataFrame(tempBetas)
        tempBetas['Weight']          = equalWeight    
    
        portWeight                   = tempBetas['Weight'].sum()
    
        if portWeight != 1:
            print('Portfolio weights do not = 100% |',BetaResults.iloc[i,0], '| Weight=' + str(portWeight))
            break      
    #-------------------------------------------------------------------------------------------------------------        
        cons    = {'type': 'eq', 'fun': Weight_constraint}
        Results = minimize(minSqError, tempBetas['Weight'],
                           args=(tempBetas['Weight'], BetaResults.iloc[:, i]),
                           constraints=cons)
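For reference, a minimal self-contained sketch of this kind of problem (made-up betas, target assumed to be 0, since the original `BetaResults` data isn't provided): `scipy.optimize.minimize` varies only the first argument of the objective, so the objective must compute with that decision vector `x` rather than with a copy of the initial guess passed through `args`, and the sum-to-one constraint can be written directly as `sum(x) - 1`.

```python
import numpy as np
from scipy.optimize import minimize

betas = np.array([0.3, 0.4])                 # made-up betas for illustration
target = 0.0                                  # assumed solver target
x0 = np.full(len(betas), 1.0 / len(betas))    # equal-weight starting guess

def sq_error(x, b):
    # x is the decision vector the solver varies on every iteration
    return (target - np.sum(x * b)) ** 2

# equality constraint: portfolio weights must sum to 1
cons = {'type': 'eq', 'fun': lambda x: np.sum(x) - 1.0}

res = minimize(sq_error, x0, args=(betas,), constraints=cons)
```

With the decision vector wired into the objective this way, the solver actually moves away from the initial guess instead of evaluating it once and stopping.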
dabrows
  • Welcome to SO. Please provide a minimal **reproducible** example, i.e. provide all of your data. Otherwise, it's hard to help properly. That being said, what's the point of the loop in your constraint function? Currently, your constraint basically reads as follows: `TotalWeight = PortWeight - x[len(x)-1]`. – joni Jul 15 '22 at 05:14
  • My syntax is wrong then - the goal of the constraint is to make sure that the sum of the degree of freedom for the solver is equal to 1. – dabrows Jul 15 '22 at 11:26

0 Answers