hi guys, I have some points on an image, in pixel coordinates (u,v). I fit a line through them using a least-squares fit, but I'm having trouble determining the start and end points of the line segment. Can you guys suggest any possible way I should go about it? thanks a lot -- aiden


1/24/2011 9:49:03 AM

Yes, I meant the start and end points of the dataset. I'm not asking anyone to do it for me, just asking for some advice. My issue is that I will have horizontal and vertical lines. For example, a vertical line has the same x coordinate at every point, and a horizontal line the same y coordinate. I'm not sure if using min(x), max(x) would suffice. --aiden
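[Editor's note: a minimal sketch, not part of the thread, illustrating the concern above. The pixel coordinates are invented example data.]

```python
# Hypothetical pixel samples along a perfectly vertical image line
# (all assumed data, just to illustrate why min(x), max(x) can fail).
pts = [(10, 3), (10, 7), (10, 5), (10, 12)]
us = [u for u, v in pts]
vs = [v for u, v in pts]

# min/max along u collapses: both "endpoints" would be the same point.
print(min(us), max(us))   # 10 10
# The spread lives entirely in v, so here endpoints come from min/max of v.
print(min(vs), max(vs))   # 3 12
# A coordinate-free fix: take extremes along whichever axis spreads more,
# or (better) along the fitted line direction itself.
print((max(us) - min(us)) < (max(vs) - min(vs)))   # True
```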


1/24/2011 6:59:05 PM

"Aidy" wrote in message <ihki5p$88i$1@fred.mathworks.com>... > Yes, I meant starting and end points of dataset. Im not asking anyone to do it....just asking for some advice. > > My issue is that i will have horizontal and vertical lines. > > For example, a vertical line will have same y coordinate , and horizontal, same x coordinate. I not sure if using min(x), max(x) would suffice. How do you perform the least-squares fitting? Just to be sure whereas you have a method symmetric to u and v or not. Bruno


1/24/2011 7:41:04 PM

"Aidy" wrote in message <ihki5p$88i$1@fred.mathworks.com>... > Yes, I meant starting and end points of dataset. Im not asking anyone to do it....just asking for some advice. > > My issue is that i will have horizontal and vertical lines. > > For example, a vertical line will have same y coordinate , and horizontal, same x coordinate. I not sure if using min(x), max(x) would suffice. > You need to appreciate that a perfectly vertical line will fail when you try the regression. Infinite slopes tend to cause problems. If this is an issue, then perhaps you need to use an orthogonal regression line. You will also need to learn to work with a line in a general parametric point-slope form. P(t) = P0 + P1*t where P0 and P1 are vectors of length 2. John


1/24/2011 7:45:04 PM