Least-squares fitting a square pulse
I am trying to fit a square pulse to data in the least-squares sense in Matlab. The parameters that are being optimized are the height, start, and end - x(1), x(2) and x(3) in the program, respectively. However, the least-squares program is only varying the height, for some reason. Why is this occurring? How else could I fit a square pulse using least-squares?
Although in this example the pulse would be easy to fit by inspection, for the purpose of my project, it is not an option.
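One likely cause, with a workaround sketched in Python rather than MATLAB: the residual of a hard-edged pulse is piecewise constant in the start/end parameters x(2) and x(3), so its gradient with respect to the edges is zero almost everywhere, and a derivative-based solver only ever moves the height. A grid search over the edges, with the height computed in closed form, sidesteps this (the function name, edge grid, and data below are illustrative):

```python
import numpy as np

def fit_pulse(x, y, edges):
    """Return (height, start, end) minimizing sum((y - h*pulse)^2)."""
    best = (0.0, edges[0], edges[-1])
    best_sse = np.inf
    for i, s in enumerate(edges):
        for e in edges[i + 1:]:
            inside = (x >= s) & (x < e)
            if not inside.any():
                continue
            h = y[inside].mean()                 # closed-form optimal height
            sse = np.sum((y - h * inside) ** 2)  # residual for these edges
            if sse < best_sse:
                best_sse, best = sse, (h, s, e)
    return best

rng = np.random.default_rng(0)
xdata = np.linspace(0, 6, 300)
ydata = -.05 + .1 * rng.random(xdata.size) + ((xdata >= 1) & (xdata < 4)).astype(float)
h, s, e = fit_pulse(xdata, ydata, np.linspace(0, 6, 61))
```

Alternatively, replace the hard edges with steep logistic ramps so the edge parameters get a usable gradient for lsqnonlin-style solvers.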
ydata=-.05+.1*rand(1,numel(xdata))+heaviside1(xdata-1)-heaviside1(xdata-4);
...has anyone coded the Deming regression model for least-squares fitting into IDL?
I was wondering if anyone had coded the Deming regression model for
least-squares fitting into IDL? If so, would you be willing to make it available?
Thanks in advance, Pete Riley
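Not IDL, but Deming regression has a short closed form, so porting it is only a few lines. A Python sketch, where delta is the (assumed known) ratio of the y-error variance to the x-error variance, and the sample data are invented:

```python
import numpy as np

def deming(x, y, delta=1.0):
    """Deming regression: least squares with errors in both x and y."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x.mean(), y.mean()
    sxx = np.sum((x - xm) ** 2)
    syy = np.sum((y - ym) ** 2)
    sxy = np.sum((x - xm) * (y - ym))
    # closed-form slope for error-variance ratio delta
    slope = (syy - delta * sxx
             + np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)
    return slope, ym - slope * xm   # (slope, intercept)

slope, intercept = deming([1, 2, 3, 4], [2.1, 3.9, 6.1, 7.9])
```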
...Adding constraints to linear fit parameters of separable least squares fit
I was wondering if anyone can help me to figure out how to add equality and inequality constraints to the linear parts of a separable least squares fit. I saw fminspleas is able to do it for the nonlinear fit parameters (using transformations from a constrained problem to an unconstrained one), but not for the linear fit parameters. Here is my code for the separable least squares fit. Thanks.
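A sketch of the variable-projection idea with a constrained inner step, in Python rather than MATLAB: for each trial value of the nonlinear parameter, solve the linear subproblem subject to the constraints (here simple non-negativity bounds via scipy.optimize.lsq_linear), and let the outer optimizer see only the resulting residual norm. The model y = c1*exp(-a*x) + c2 and all data are invented for illustration:

```python
import numpy as np
from scipy.optimize import minimize_scalar, lsq_linear

x = np.linspace(0, 5, 50)
y = 3.0 * np.exp(-1.2 * x) + 0.5      # synthetic data, c1=3, c2=0.5, a=1.2

def inner_sse(a):
    # linear subproblem in (c1, c2), with bound constraints 0 <= c1, c2
    A = np.column_stack([np.exp(-a * x), np.ones_like(x)])
    res = lsq_linear(A, y, bounds=(0, np.inf))
    return res.cost, res.x

# outer 1-D search over the nonlinear parameter a
out = minimize_scalar(lambda a: inner_sse(a)[0], bounds=(0.1, 5.0), method='bounded')
a_hat = out.x
c_hat = inner_sse(a_hat)[1]
```

General linear equality constraints need a different inner solver (e.g. a KKT solve) in place of lsq_linear, but the outer structure is the same.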
f ...
...Least square fitting
I have this graph:
x = [0.000235 0.00025 0.0003 0.0004 0.0005 0.0006 0.00075]
y = [54.9 43.1 62.7 86.3 86.3 90.2 96.1]
plot (x, y)
ylabel ('Correct Detection [%]')
legend ('Psychometric Function')
and I want to fit it to this equation:
P(A; theta) = [1/sqrt(2*pi*sigma^2)] * integral exp(-(A - theta)^2 / (2*sigma^2)) dA
how can i do this?
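The integral above is just the cumulative normal, so one approach (sketched in Python; the scaling to percent and the starting guesses are my assumptions, not from the post) is to fit norm.cdf directly with curve_fit:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

x = np.array([0.000235, 0.00025, 0.0003, 0.0004, 0.0005, 0.0006, 0.00075])
y = np.array([54.9, 43.1, 62.7, 86.3, 86.3, 90.2, 96.1])

def psychometric(A, theta, sigma):
    # cumulative normal, scaled to percent; abs() guards against a
    # sign flip of sigma during the iteration
    return 100.0 * norm.cdf(A, loc=theta, scale=abs(sigma))

p0 = [np.median(x), np.std(x)]      # rough data-driven starting guesses
(theta, sigma), _ = curve_fit(psychometric, x, y, p0=p0)
pred = psychometric(x, theta, sigma)
```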
"xety89 " <firstname.lastname@example.org> wrote in message <email@example.com>...
> I have this graph ...
...least squares fit
I'm trying to solve a system of 5+ equations for 3 unknowns,
preferably using the least-squares method. I've found that Solve will not
return solutions for more equations than you have unknowns, and Fit does
not work because my equations have 4 sets of data.
Si=a*Ui + b*Vi + c*Wi, where i goes from 1 to 5
U, V, and W all come from a table, and S is from experimental data.
I need to know how to solve for a,b,c given more than 3 equations.
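With more equations than unknowns there is no exact solution; the least-squares answer comes from stacking the equations S_i = a*U_i + b*V_i + c*W_i into a 5x3 matrix and using a least-squares solver (Mathematica's LeastSquares, MATLAB's backslash, and NumPy's lstsq all do the same job). A sketch with invented numbers standing in for the tabulated U, V, W and measured S:

```python
import numpy as np

U = np.array([1.0, 2.0, 0.5, 1.5, 3.0])
V = np.array([0.2, 1.0, 0.7, 0.3, 0.9])
W = np.array([2.0, 0.1, 1.1, 0.8, 0.4])
a_true, b_true, c_true = 1.3, -0.7, 2.1
S = a_true * U + b_true * V + c_true * W       # stand-in "experimental" values

M = np.column_stack([U, V, W])                 # 5 equations, 3 unknowns
(a, b, c), *_ = np.linalg.lstsq(M, S, rcond=None)
```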
Background: The problem is actually trying to solve for Judd-Ofelt
parameters of a laser crystal. S is the experimental line strength and
U, V ...
...Least Square Fit
I tried to use lsqlin to do a least-squares fit with 3 constraints. I
had 7 variables. However, it didn't seem to take into account the last
constraint. I am trying to use fmincon but keep getting the error 'Inner
matrix dimensions must agree'. I checked the dimensions of the constraint
and objective function matrices, but I can't figure out why this is happening.
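One way to check whether the solver is really ignoring a constraint is to solve the equality-constrained least-squares problem by hand via the KKT system and compare; if the hand-solved answer honors all constraints but the solver's doesn't, the constraint matrix shapes are the usual suspect. A Python sketch with random stand-in data (7 variables and 3 equality constraints, as in the post):

```python
import numpy as np

rng = np.random.default_rng(1)
C = rng.standard_normal((12, 7))       # design matrix, 7 variables
d = rng.standard_normal(12)
Aeq = rng.standard_normal((3, 7))      # 3 equality constraints Aeq @ x = beq
beq = rng.standard_normal(3)

# KKT system for min ||C@x - d||^2  s.t.  Aeq@x = beq:
#   [2C'C  Aeq'] [x]   [2C'd]
#   [Aeq    0  ] [l] = [beq ]
K = np.block([[2 * C.T @ C, Aeq.T],
              [Aeq, np.zeros((3, 3))]])
rhs = np.concatenate([2 * C.T @ d, beq])
sol = np.linalg.solve(K, rhs)
x = sol[:7]                            # constrained least-squares solution
```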
In article <firstname.lastname@example.orgYaTP>,
"Shar P" <email@example.com> writes:
>I tried to use lsqlin to do a least square fit with 3 constraints. I
>had 7 variables. However it ...
...Least square fitting
I have a problem with a rather complicated function depending on four
parameters which I try to find using least square fitting and I don't
know exactly how to do it.
The basic problem is the following:
I have an astronomical image of a star field and try to relate the sky
coordinates (right ascension, declination) of the stars to the pixel
coordinates (x, y).
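A stripped-down sketch of such a fit in Python, assuming the field is small enough to treat as flat, pixel = f * R(beta) @ (sky - sky0). A real solution needs a proper gnomonic projection, and all parameter names and numbers here are invented:

```python
import numpy as np
from scipy.optimize import least_squares

def model(p, sky):
    # p = (focal scale f, rotation beta, sky centre ra0, dec0)
    f, beta, ra0, dec0 = p
    c, s = np.cos(beta), np.sin(beta)
    R = np.array([[c, -s], [s, c]])
    return f * R @ (sky - np.array([ra0, dec0])[:, None])

# fake "catalog" of 6 stars (rows: ra, dec offsets in radians)
sky = np.array([[0.01, 0.03, -0.02, 0.04, 0.00, -0.03],
                [0.02, -0.01, 0.03, 0.00, -0.04, 0.01]])
p_true = [5000.0, 0.1, 0.002, -0.001]
pix = model(p_true, sky)                       # synthetic pixel coordinates

def resid(p):
    return (model(p, sky) - pix).ravel()

fit = least_squares(resid, x0=[4000.0, 0.0, 0.0, 0.0])
```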
The function relating the two depends on parameters that are not
accurately known: the focal length of the lens (f), the rotation of the
field of view with respect to the north direction (beta), and the center sky
coordinates of the ...
...least square fitting
dear Gnuplot users,
I am using the following command to fit a set of data:
fit f(x) 'dataset' via a,b
then I get the value of a and b with only one decimal digit like a=0.1,
how can I get five decimal digits like a=0.00001, b=0.91112 ?
David <firstname.lastname@example.org> wrote:
> dear Gnuplot users,
> I am using the following command to fit a set of data:
> fit f(x) 'dataset' via a,b
> then I get the value of a and b with only one decimal digit like a=0.1,
"get the value" ...
...Least squares fitting
I am working on an inverse problem where I need to use least squares fitting.
I am working on characterization of impedance of a piezoelectric crystal. A PZT crystal can be modeled as a series RLC circuit in parallel with a parasitic capacitor. Thus, I can derive the transfer function of its impedance and admittance and simulate it (see the code below)
I am trying to extract the R, L, C and C0 parameters from these simulated admittance curves using the least-squares fitting method. (Also, I need to inversely calculate these from the admittance curves that are constructed using ...
...least square fit #4
I tried to fit data to get the sig and mean parameters of a
Gaussian distribution. I found that lsqcurvefit gets the correct
fitting result for sig but not for the mean: the mean always stays at the
initial value you set for the fitting.
Any input is highly appreciated.
My code is very simple. Is there something wrong with it?
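A common cause of a stuck mean (sketched in Python rather than MATLAB): if the starting mean is far from the data, the Gaussian is essentially zero at every sample, the residual's gradient with respect to the mean vanishes, and the solver never moves it. Seeding the guess from the data itself avoids this; the data and names here are illustrative:

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss(x, amp, mu, sig):
    return amp * np.exp(-(x - mu) ** 2 / (2 * sig ** 2))

x = np.linspace(0, 10, 200)
y = gauss(x, 2.0, 6.5, 0.8)               # synthetic data

# data-driven starting point: peak height, peak location, unit width
p0 = [y.max(), x[np.argmax(y)], 1.0]
(amp, mu, sig), _ = curve_fit(gauss, x, y, p0=p0)
```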
options=optimset('MaxFunEvals',1000,'MaxIter...
...Least square curve fitting
I need some help with a curve fitting problem for data points.
I have a set of data points
g=[g1 g2 g3 g4 g5]
h=[h1 h2 h3 h4 h5]
and would like to fit these data to the equation y = a*x/(b + x)
in the least-squares sense, with initial guesses a = 170 and
b = 1000, until the R2 correlation is 95%.
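A note on the "until R2 is 95%" part: the fit is done once and R2 is checked afterwards; it is not an iteration target. A Python sketch with the stated guesses a = 170, b = 1000 and invented (g, h) values standing in for the real data points:

```python
import numpy as np
from scipy.optimize import curve_fit

def hyperbola(x, a, b):
    return a * x / (b + x)

g = np.array([100.0, 300.0, 700.0, 1500.0, 4000.0])   # stand-in g1..g5
h = np.array([16.0, 39.0, 70.0, 100.0, 136.0])        # stand-in h1..h5

(a, b), _ = curve_fit(hyperbola, g, h, p0=[170.0, 1000.0])
pred = hyperbola(g, a, b)
r2 = 1 - np.sum((h - pred) ** 2) / np.sum((h - h.mean()) ** 2)
```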
"Alison " <email@example.com> wrote in message
> Hi all,
> I need some help with a curve fitting problem for data
> I have a set of data points
> g=[g1 g2 g3 g4 g5]
> h=[h1 h2 h3 h4 h...
...Which is the fastest least squared fit?
I have a program that fits measured data to a function with a least squares fit. The measured data contains about 30 data points, and the fitting function has 5 parameters in total:
y(x) = c1 * cos(c2*x + c3) * sin(c4*x)/x * exp(- c5*x)
I also want to apply boundaries to the constants. I have tried several fitting algorithms, e.g. fminsearch and lsqnonlin. The trouble is that, since the fit is performed several thousand times each time the program is executed, it is important that the function provides a fast fit. I have discovered that lsqnonlin is faster than fminsearch, but execution ...
...Uncertainty in least square fits
I am using polyfit for a set of my data points, which is
fitted to a parabolic curve as y = p2*x^2 + p1*x + p0. I am not
quite sure how to extract the uncertainties associated with
each of these coefficients. I know the algorithm from the
least-squares fit procedure, but I'd like to know if Matlab
has a built-in function for this. Eventually, I would have
to fit the data points using a high-order polynomial, so it
would be useful to know how to estimate the errors at this ...
"Woo-Joong Kim" <firstname.lastname@example.org> wrote in message <fe1fj9
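For the polyfit question above: the fit's covariance matrix gives the coefficient uncertainties directly; NumPy's polyfit returns it with cov=True, and MATLAB's [p, S] = polyfit output carries the same information (via S.R and S.normr). A sketch with synthetic parabola data:

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(-3, 3, 40)
y = 2.0 * x ** 2 - 1.0 * x + 0.5 + 0.1 * rng.standard_normal(x.size)

coeffs, cov = np.polyfit(x, y, 2, cov=True)
sigma = np.sqrt(np.diag(cov))     # 1-sigma uncertainties of [p2, p1, p0]
```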