Generalized Cross Product

I have a pair of N-dimensional vectors, N large (hundreds), and I need to
generate another N-dimensional vector that is orthogonal to the first two.
It doesn't have to be unique; it doesn't even have to be perfectly
orthogonal, as long as the normalized dot products are less than, oh,
about 0.005.

I've searched for "generalized cross product" and similar terms, and all I
found was a lot of statements that the cross product only exists for N=3
and N=7, philosophical discussions of what the cross product really means,
etc., but nowhere did I find a simple way to find an N-dimensional vector
approximately orthogonal to two other N-dimensional vectors.

Help and guidance requested and appreciated.

Greg
Greg
12/8/2016 1:47:24 PM

On 2016-12-08 13:47:24 +0000, Greg Berchin said:

> I have a pair of N-dimensional vectors, N large (hundreds), and I need 
> to generate another N-dimensional vector that is orthogonal to the 
> first two. It doesn't have to be unique; it doesn't even have to be 
> perfectly orthogonal, as long as the normalized dot products are less 
> than, oh, about 0.005.
> 
> I've searched for "generalized cross product" and similar terms, and 
> all I found was a lot of statements that the cross product only exists 
> for N=3 and N=7, philosophical discussions of what the cross product 
> really means, etc., but nowhere did I find a simple way to find an 
> N-dimensional vector approximately orthogonal to two other 
> N-dimensional vectors.
> 
> Help and guidance requested and appreciated.
> 
> Greg

Pick a random vector. Project out the two vectors you have. The result will be
orthogonal (except for bad luck when it is zero).

It is called Gram-Schmidt orthogonalization.
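
A minimal Python/NumPy sketch of that recipe (the names and sizes here are
illustrative; the second projection is exact only when the two given
vectors are themselves orthogonal, as Greg's nearly are -- otherwise
orthogonalize v2 against v1 first):

    import numpy as np

    rng = np.random.default_rng()
    N = 300                              # stand-in for "hundreds"
    v1 = rng.standard_normal(N)          # stand-ins for the two given vectors
    v2 = rng.standard_normal(N)

    r = rng.standard_normal(N)           # random starting vector
    r = r - (r @ v1) / (v1 @ v1) * v1    # project out v1
    r = r - (r @ v2) / (v2 @ v2) * v2    # project out v2

    # normalized dot products should now be tiny
    print(abs(r @ v1) / (np.linalg.norm(r) * np.linalg.norm(v1)))
    print(abs(r @ v2) / (np.linalg.norm(r) * np.linalg.norm(v2)))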

Gordon
12/8/2016 3:55:01 PM
On Thursday, December 8, 2016 at 9:55:04 AM UTC-6, Gordon Sande wrote:

> It is called Gram-Schmidt orthogonalization.

Thanks, that looks like what I need. You know, all the searching that I did for orthogonal vectors, and G-S didn't come up once in any of the results.

Greg
12/8/2016 5:13:09 PM
On Thu, 08 Dec 2016 11:55:01 -0400, Gordon Sande wrote:

> On 2016-12-08 13:47:24 +0000, Greg Berchin said:
> 
>> I have a pair of N-dimensional vectors, N large (hundreds), and I need
>> to generate another N-dimensional vector that is orthogonal to the
>> first two. It doesn't have to be unique; it doesn't even have to be
>> perfectly orthogonal, as long as the normalized dot products are less
>> than, oh, about 0.005.
>> 
>> I've searched for "generalized cross product" and similar terms, and
>> all I found was a lot of statements that the cross product only exists
>> for N=3 and N=7, philosophical discussions of what the cross product
>> really means, etc., but nowhere did I find a simple way to find an
>> N-dimensional vector approximately orthogonal to two other
>> N-dimensional vectors.
>> 
>> Help and guidance requested and appreciated.
>> 
>> Greg
> 
> Pick a random vector. Project out the two vectors you have. The result
> will be orthogonal (except for bad luck when it is zero).
> 
> It is called Gram-Schmidt orthogonalization.

I think you can do something similar with QR decomposition (actually, I 
think QR may have Gram-Schmidt somewhere inside).
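
One way that idea might look in NumPy (a sketch, not anything canonical:
append a random column, factor, and keep the third column of Q):

    import numpy as np

    rng = np.random.default_rng()
    N = 300
    v1, v2 = rng.standard_normal(N), rng.standard_normal(N)

    # QR of [v1 v2 r]: the third column of Q is orthonormal to the span
    # of the first two columns, hence orthogonal to v1 and v2 up to rounding.
    A = np.column_stack([v1, v2, rng.standard_normal(N)])
    Q, _ = np.linalg.qr(A)               # reduced QR; Q is N-by-3
    v3 = Q[:, 2]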

Given who you are you must be aware that in an N-dimensional space you 
can make a set of N mutually orthogonal vectors.  So if three orthogonal 
vectors is magic because there's three spatial dimensions, but you're 
working with N-dimensional vectors, there may be a mismatch between your 
problem and whatever solution you're thinking of.

-- 
Tim Wescott
Control systems, embedded software and circuit design
I'm looking for work!  See my website if you're interested
http://www.wescottdesign.com
Tim
12/8/2016 5:22:14 PM
On Thursday, December 8, 2016 at 11:22:22 AM UTC-6, Tim Wescott wrote:

> Given who you are you must be aware that in an N-dimensional space you
> can make a set of N mutually orthogonal vectors.  So if three orthogonal
> vectors is magic because there's three spatial dimensions, but you're
> working with N-dimensional vectors, there may be a mismatch between your
> problem and whatever solution you're thinking of.

No, it has nothing to do with spatial dimensions. I just need three
mutually orthogonal vectors; I already have two of them. The third must
meet some other criteria, but I figured that I would create several of
them and choose the one that came closest to satisfying those other
criteria. It's an engineer's solution to a math problem -- doesn't need to
be perfect, just arbitrarily close to it.

Thanks,
Greg
Greg
12/8/2016 7:27:27 PM
On 2016-12-08 17:22:14 +0000, Tim Wescott said:

> On Thu, 08 Dec 2016 11:55:01 -0400, Gordon Sande wrote:
> 
>> On 2016-12-08 13:47:24 +0000, Greg Berchin said:
>> 
>>> I have a pair of N-dimensional vectors, N large (hundreds), and I need
>>> to generate another N-dimensional vector that is orthogonal to the
>>> first two. It doesn't have to be unique; it doesn't even have to be
>>> perfectly orthogonal, as long as the normalized dot products are less
>>> than, oh, about 0.005.
>>> 
>>> I've searched for "generalized cross product" and similar terms, and
>>> all I found was a lot of statements that the cross product only exists
>>> for N=3 and N=7, philosophical discussions of what the cross product
>>> really means, etc., but nowhere did I find a simple way to find an
>>> N-dimensional vector approximately orthogonal to two other
>>> N-dimensional vectors.
>>> 
>>> Help and guidance requested and appreciated.
>>> 
>>> Greg
>> 
>> Pick a random vector. Project out the two vectors you have. The result
>> will be orthogonal (except for bad luck when it is zero).
>> 
>> It is called Gram-Schmidt orthogonalization.
> 
> I think you can do something similar with QR decomposition (actually, I
> think QR may have Gram-Schmidt somewhere inside).
> 
> Given who you are you must be aware that in an N-dimensional space you
> can make a set of N mutually orthogonal vectors.  So if three orthogonal
> vectors is magic because there's three spatial dimensions, but you're
> working with N-dimensional vectors, there may be a mismatch between your
> problem and whatever solution you're thinking of.

There are several standard methods of achieving a QR decomposition. They
differ in how easy it is to recover Q, rather than a factored version of
it. The methods carry names like Givens, Householder and even
Gram-Schmidt. If you do G-S the way classical math texts describe it you
will find that it has numerical problems. There is a minor rearrangement
that cures the issue and goes by the name of (wait for it) Modified
Gram-Schmidt.
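
The change is small enough to show in a few lines of NumPy (a sketch; the
function name is mine): each projection is subtracted from the updated
working vector rather than from the original.

    import numpy as np

    def mgs_orthogonalize(r, basis):
        """Remove from r its component along each vector in `basis`,
        Modified Gram-Schmidt style (sequential updates)."""
        r = np.array(r, dtype=float)
        for q in basis:
            r -= (r @ q) / (q @ q) * q   # uses the current r, not the original
        return r

In exact arithmetic this matches the classical formula; in floating point
it keeps the residual from re-acquiring components along earlier vectors.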



Gordon
12/8/2016 9:14:42 PM
> I have a pair of N-dimensional vectors, N large (hundreds), and I need
> to generate another N-dimensional vector that is orthogonal to the
> first two. It doesn't have to be unique; it doesn't even have to be
> perfectly orthogonal, as long as the normalized dot products are less
> than, oh, about 0.005.

Just look at the first three indices. If those two 3-vectors are nonparallel and nonzero, take their cross product. That gives you the first three entries of your vector, and you can fill the rest with zeros.

If the cross product is zero, pick another set of three indices and try again until it works.
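
A sketch of that trick in NumPy (the function name is mine; indices
default to the first three, as described). The result is exactly
orthogonal to both inputs, since the zero-padded entries contribute
nothing to the dot products:

    import numpy as np

    def padded_cross(v1, v2, idx=(0, 1, 2)):
        """Cross product of v1 and v2 restricted to three indices,
        zero-padded everywhere else."""
        idx = list(idx)
        c = np.cross(v1[idx], v2[idx])
        if np.allclose(c, 0.0):
            raise ValueError("restricted 3-vectors are parallel; try other indices")
        v3 = np.zeros_like(v1, dtype=float)
        v3[idx] = c
        return v3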

-Ethan
 
ethan
12/8/2016 10:55:53 PM
On Thursday, December 8, 2016 at 8:47:27 AM UTC-5, Greg Berchin wrote:
> I have a pair of N-dimensional vectors, N large (hundreds), and I need
> to generate another N-dimensional vector that is orthogonal to the first
> two. It doesn't have to be unique; it doesn't even have to be perfectly
> orthogonal, as long as the normalized dot products are less than, oh,
> about 0.005.
> 
> I've searched for "generalized cross product" and similar terms, and all
> I found was a lot of statements that the cross product only exists for
> N=3 and N=7, philosophical discussions of what the cross product really
> means, etc., but nowhere did I find a simple way to find an
> N-dimensional vector approximately orthogonal to two other N-dimensional
> vectors.
> 
> Help and guidance requested and appreciated.
> 
> Greg

Here's an extremely simple but powerful way to generate an "N"-dimensional
vector orthogonal not only to two but to the whole bunch.

You've seen how a 3-D vector cross product (A x B) is expressible as the
determinant of a matrix with
 * a top row made up of unit vectors denoted I J K
 * the second row containing the scalar components of A
 * the third row containing the scalar components of B

Just follow that pattern, now with the top row having "N" orthogonal unit
vectors above "N-1" rows containing the scalar components of the
"N"-dimensional vectors.
Ref: Integrated Aircraft Navigation, top of page 325
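
Expanding that determinant along the top row gives component i of the
result as a signed (N-1)x(N-1) minor. A direct NumPy sketch (my own
rendering of the pattern; note it needs N-1 input vectors and costs
roughly O(N^4), so it gets heavy for N in the hundreds):

    import numpy as np

    def generalized_cross(vectors):
        """'Cross product' of N-1 vectors in R^N via cofactor expansion
        of the determinant with unit vectors in the (symbolic) top row.
        The result is orthogonal to every input vector."""
        m = np.atleast_2d(np.asarray(vectors, dtype=float))
        n = m.shape[1]
        assert m.shape[0] == n - 1, "needs exactly N-1 vectors of length N"
        return np.array([(-1) ** i * np.linalg.det(np.delete(m, i, axis=1))
                         for i in range(n)])

For N=3 this reproduces the ordinary cross product.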
navaide
12/9/2016 12:28:58 AM
On Thursday, December 8, 2016 at 7:29:01 PM UTC-5, navaide wrote:
> On Thursday, December 8, 2016 at 8:47:27 AM UTC-5, Greg Berchin wrote:
> > I have a pair of N-dimensional vectors, N large (hundreds), and I need
> > to generate another N-dimensional vector that is orthogonal to the
> > first two. It doesn't have to be unique; it doesn't even have to be
> > perfectly orthogonal, as long as the normalized dot products are less
> > than, oh, about 0.005.
> > 
> > I've searched for "generalized cross product" and similar terms, and
> > all I found was a lot of statements that the cross product only exists
> > for N=3 and N=7, philosophical discussions of what the cross product
> > really means, etc., but nowhere did I find a simple way to find an
> > N-dimensional vector approximately orthogonal to two other
> > N-dimensional vectors.
> > 
> > Help and guidance requested and appreciated.
> > 
> > Greg
> 
> Here's an extremely simple but powerful way to generate an
> "N"-dimensional vector orthogonal not only to two but to the whole bunch.
> 
> You've seen how a 3-D vector cross product (A x B) is expressible as the
> determinant of a matrix with
>  * a top row made up of unit vectors denoted I J K
>  * the second row containing the scalar components of A
>  * the third row containing the scalar components of B
> 
> Just follow that pattern, now with the top row having "N" orthogonal
> unit vectors above "N-1" rows containing the scalar components of the
> "N"-dimensional vectors.
> Ref: Integrated Aircraft Navigation, top of page 325

this determinant thing was the method i was going to guess at, Greg.  i am
not sure that the Gram-Schmidt orthogonalization process will give you the
cross product unless you scale it to be the correct length.

(BTW, Greg, i got to meet Jay Frigoletto last weekend.  damn good pianist.)

L8r

r b-j

robert
12/9/2016 8:57:43 AM
On Thu, 8 Dec 2016 16:28:58 -0800 (PST), navaide <navaide@gmail.com>
wrote:

>Here's an extremely simple but powerful way to generate an "N"-dimensional vector orthogonal not only to two but to the whole bunch.

Yes, that looks like the wedge product. I saw several references to it,
but they all came with warnings that the vector derived in this manner
was in a different vector space than the vectors from which it was
derived. I haven't had enough time to dig into it deeply enough to grok
the implications.

Thanks,
Greg
Greg
12/9/2016 12:56:08 PM
On Fri, 09 Dec 2016 06:56:08 -0600, Greg Berchin wrote:

> On Thu, 8 Dec 2016 16:28:58 -0800 (PST), navaide <navaide@gmail.com>
> wrote:
> 
>>Here's an extremely simple but powerful way to generate an
>>"N"-dimensional vector orthogonal not only to two but to the whole
>>bunch.
> 
> Yes, that looks like the wedge product. I saw several references to it,
> but they all came with warnings that the vector derived in this manner
> was in a different vector space than the vectors from which it was
> derived. I haven't had enough time to dig into it deeply enough to grok
> the implications.
> 
> Thanks,
> Greg

"In a different vector space from" and "orthogonal to" are, I'm 99.44% 
sure, the same statement.

If you're seeking a third vector that's orthogonal to the two you have, 
with no particular care about the N-3 dimensions that you're not 
addressing*, and if N is in the 100's, then I think this determinant-
finding approach is going to be much more computationally intensive than 
other approaches.

You haven't mentioned if your two starting vectors are orthogonal -- are 
they?  Assuming they aren't:

Start with V1 and V2.  You want to find V3 such that V1 dot V3 = 0 and V2 
dot V3 = 0.

Find V2a = V2 - V1 * (V1 dot V2)/ norm(V1)^2.  V2a is (if I'm getting my 
math right) orthogonal to V1.  If it's zero, bomb out or choose another 
V2.

Arbitrarily choose a V3a.

Find 
V3 =  V3a - V1 * (V1 dot V3a) / norm(V1)^2 - 
      V2a * (V2a dot V3a) / norm(V2a)^2.

If V3 is zero, bomb out or choose another V3a.  Otherwise, you're done.
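
In NumPy the recipe above might look like this (a sketch; the random
choice of V3a is mine):

    import numpy as np

    def find_v3(v1, v2, rng=None):
        rng = rng or np.random.default_rng()
        v2a = v2 - v1 * (v1 @ v2) / (v1 @ v1)       # V2a orthogonal to V1
        if np.allclose(v2a, 0.0):
            raise ValueError("V2 is parallel to V1; choose another V2")
        v3a = rng.standard_normal(v1.size)          # arbitrary V3a
        v3 = (v3a - v1 * (v1 @ v3a) / (v1 @ v1)
                  - v2a * (v2a @ v3a) / (v2a @ v2a))
        if np.allclose(v3, 0.0):
            raise ValueError("unlucky V3a; choose another")
        return v3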

* which just seems weird to me, and was why I was questioning your 
approach earlier.  Sorry, but it just _feels_ like you're missing 
something -- or maybe N-3 somethings.

-- 
www.wescottdesign.com
Tim
12/9/2016 5:37:06 PM
On Thu, 08 Dec 2016 14:55:53 -0800, ethan wrote:

>> I have a pair of N-dimensional vectors, N large (hundreds), and I need
>> to generate another N-dimensional vector that is orthogonal to the
>> first two. It doesn't have to be unique; it doesn't even have to be
>> perfectly orthogonal, as long as the normalized dot products are less
>> than, oh, about 0.005.
> 
> Just look at the first three indices. If those two 3-vectors are
> nonparallel and nonzero, take their cross product. That gives you the
> first three entries of your vector, and you can fill the rest with
> zeros.
> 
> If the cross product is zero, pick another set of three indices and try
> again until it works.

That could get you a really sucky vector, if the three you pick are 
mostly parallel.

-- 

Tim Wescott
Wescott Design Services
http://www.wescottdesign.com

I'm looking for work -- see my website!
Tim
12/9/2016 8:08:45 PM
On Fri, 09 Dec 2016 11:37:06 -0600, Tim Wescott
<tim@seemywebsite.really> wrote:

>If you're seeking a third vector that's orthogonal to the two you have, 
>with no particular care about the N-3 dimensions that you're not 
>addressing*, and if N is in the 100's, then I think this determinant-
>finding approach is going to be much more computationally intensive than 
>other approaches.

My hunch was the same about this.

>You haven't mentioned if your two starting vectors are orthogonal -- are 
>they?  Assuming they aren't:

They weren't designed to be orthogonal, but by coincidence their
normalized dot product is <0.0004. That's close enough to orthogonal for
my purposes.

>Find V2a = V2 - V1 * (V1 dot V2)/ norm(V1)^2.  V2a is (if I'm getting my 
>math right) orthogonal to V1.  

Yes, now that I have re-reviewed my linear algebra, this is a standard
orthogonal projection technique. Even though my current V2 is very
nearly orthogonal to V1, I will probably do this to make that last
little adjustment.

>Arbitrarily choose a V3a.
>
>Find 
>V3 =  V3a - V1 * (V1 dot V3a) / norm(V1)^2 - 
>      V2a * (V2a dot V3a) / norm(V2a)^2.

Right. My intent is to compute several of them, and then examine them
for other characteristics that I have not listed here.

>* which just seems weird to me, and was why I was questioning your 
>approach earlier.  Sorry, but it just _feels_ like you're missing 
>something -- or maybe N-3 somethings.

I have three communications channels, and I need orthogonal pilot
symbols because there is significant crosstalk.

Thanks again,
Greg
Greg
12/9/2016 9:29:35 PM
On Fri, 09 Dec 2016 15:29:35 -0600, Greg Berchin
<gjberchin@chatter.net.invalid> wrote:

>On Fri, 09 Dec 2016 11:37:06 -0600, Tim Wescott
><tim@seemywebsite.really> wrote:
>
>>If you're seeking a third vector that's orthogonal to the two you have, 
>>with no particular care about the N-3 dimensions that you're not 
>>addressing*, and if N is in the 100's, then I think this determinant-
>>finding approach is going to be much more computationally intensive than 
>>other approaches.
>
>My hunch was the same about this.
>
>>You haven't mentioned if your two starting vectors are orthogonal -- are 
>>they?  Assuming they aren't:
>
>They weren't designed to be orthogonal, but by coincidence their
>normalized dot product is <0.0004. That's close enough to orthogonal for
>my purposes.
>
>>Find V2a = V2 - V1 * (V1 dot V2)/ norm(V1)^2.  V2a is (if I'm getting my 
>>math right) orthogonal to V1.  
>
>Yes, now that I have re-reviewed my linear algebra, this is a standard
>orthogonal projection technique. Even though my current V2 is very
>nearly orthogonal to V1, I will probably do this to make that last
>little adjustment.
>
>>Arbitrarily choose a V3a.
>>
>>Find 
>>V3 =  V3a - V1 * (V1 dot V3a) / norm(V1)^2 - 
>>      V2a * (V2a dot V3a) / norm(V2a)^2.
>
>Right. My intent is to compute several of them, and then examine them
>for other characteristics that I have not listed here.
>
>>* which just seems weird to me, and was why I was questioning your 
>>approach earlier.  Sorry, but it just _feels_ like you're missing 
>>something -- or maybe N-3 somethings.
>
>I have three communications channels, and I need orthogonal pilot
>symbols because there is significant crosstalk.

Oh, now I think I see what you're doing.   FWIW, I usually do such
things by exhaustive search or similar processes, but with several
hundred dimensions that might get difficult.   The benefit of an
exhaustive (or sparsely "exhaustive") search is that you can
catalog/classify a number of mutually "orthogonal" vectors of varying
quality or sufficiency, especially if they also need other
characteristics like autocorrelation behavior, etc.   I've not had to
do it with very long vectors like that before, though.

I had to dig a little to see that Gram-Schmidt orthonormalization and
orthogonalization are essentially the same thing.   I had heard of it
in the context of "orthonormalization" before, but expected it was, at
worst, nearly the same thing.


eric
12/9/2016 10:47:06 PM
> That could get you a really sucky vector, if the three you pick are
> mostly parallel.

Hence the "try again until it works" step.

To spell out a simple algorithm for picking three workable indices: look
for an index where either V1 or V2 is nonzero. That's your first index.
For the second index, you just need to make sure that the 2-vectors you
get by looking at these two indices are not parallel. For the third index
you can pick any one you want. Voila, nonparallel 3-vectors.

Given the stated requirements, I think it would be overkill to do any
computation involving every element of the vectors.
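
A sketch of that selection loop in NumPy (names mine); the indices it
returns can feed the zero-padded cross product sketched earlier:

    import numpy as np

    def pick_indices(v1, v2, tol=1e-12):
        """Find three indices where the restricted 3-vectors of v1 and
        v2 are nonzero and nonparallel."""
        n = v1.size
        i = next(k for k in range(n) if abs(v1[k]) > tol or abs(v2[k]) > tol)
        for j in range(n):
            # nonzero 2x2 determinant <=> the 2-vectors (v1[i], v1[j])
            # and (v2[i], v2[j]) are not parallel
            if j != i and abs(v1[i] * v2[j] - v1[j] * v2[i]) > tol:
                k = next(m for m in range(n) if m not in (i, j))
                return i, j, k
        raise ValueError("v1 and v2 are parallel wherever they are nonzero")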

-Ethan
ethan
12/9/2016 10:54:15 PM
On 12/09/16 16:29, Greg Berchin wrote:
> On Fri, 09 Dec 2016 11:37:06 -0600, Tim Wescott
> <tim@seemywebsite.really> wrote:
>
>> If you're seeking a third vector that's orthogonal to the two you have,
>> with no particular care about the N-3 dimensions that you're not
>> addressing*, and if N is in the 100's, then I think this determinant-
>> finding approach is going to be much more computationally intensive than
>> other approaches.
>
> My hunch was the same about this.
>
>> You haven't mentioned if your two starting vectors are orthogonal -- are
>> they?  Assuming they aren't:
>
> They weren't designed to be orthogonal, but by coincidence their
> normalized dot product is <0.0004. That's close enough to orthogonal for
> my purposes.
>
>> Find V2a = V2 - V1 * (V1 dot V2)/ norm(V1)^2.  V2a is (if I'm getting my
>> math right) orthogonal to V1.
>
> Yes, now that I have re-reviewed my linear algebra, this is a standard
> orthogonal projection technique. Even though my current V2 is very
> nearly orthogonal to V1, I will probably do this to make that last
> little adjustment.
>
>> Arbitrarily choose a V3a.
>>
>> Find
>> V3 =  V3a - V1 * (V1 dot V3a) / norm(V1)^2 -
>>      V2a * (V2a dot V3a) / norm(V2a)^2.
>
> Right. My intent is to compute several of them, and then examine them
> for other characteristics that I have not listed here.
>
>> * which just seems weird to me, and was why I was questioning your
>> approach earlier.  Sorry, but it just _feels_ like you're missing
>> something -- or maybe N-3 somethings.
>
> I have three communications channels, and I need orthogonal pilot
> symbols because there is significant crosstalk.
>
> Thanks again,
> Greg
>
FWIW, if you make a matrix A whose (two) rows are the two vectors you 
start with, then you are asking for a method of finding an element of 
the nullspace (a.k.a. the kernel) of A, i.e. the set of solutions x of 
the equation Ax = 0.  Algorithms for finding a basis for the nullspace 
of an arbitrary matrix are standard linear algebra (wikipedia gives one 
using Gaussian elimination on its page for kernels in linear algebra, 
and as other posters have indicated you can also use other matrix 
factorization algorithms to solve this); once you have a basis for the 
nullspace I'd expect it would be easier to look for vectors satisfying 
the additional conditions you alluded to.
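
For instance, a small SVD-based sketch in NumPy (scipy.linalg.null_space
packages the same idea):

    import numpy as np

    def nullspace_basis(A, tol=1e-10):
        """Orthonormal basis for ker(A): the right singular vectors
        whose singular values are numerically zero."""
        _, s, vt = np.linalg.svd(A)
        rank = int(np.sum(s > tol))
        return vt[rank:].T               # columns span the nullspace

    # A = np.vstack([v1, v2]); any column (or combination of columns)
    # of nullspace_basis(A) is orthogonal to both vectors.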

Hope this is helpful!

Robert Beaudoin

Robert
12/11/2016 4:29:10 AM
On Sat, 10 Dec 2016 23:29:10 -0500, "Robert E. Beaudoin"
<rbeaudoin@comcast.net> wrote:

>FWIW, if you make a matrix A whose (two) rows are the two vectors you 
>start with, then you are asking for a method of finding an element of 
>the nullspace (a.k.a. the kernel) of A, i.e. the set of solutions x of 
>the equation Ax = 0. 

Yes; once somebody reminded me that this was a linear algebra problem,
and not just specifically a cross-product problem, it all fell into
place.

Thanks to all,
Greg
Greg
12/11/2016 12:57:45 PM
On Sunday, December 11, 2016 at 7:57:51 AM UTC-5, Greg Berchin wrote:
> On Sat, 10 Dec 2016 23:29:10 -0500, "Robert E. Beaudoin"
> <rbeaudoin@comcast.net> wrote:
> 
> >FWIW, if you make a matrix A whose (two) rows are the two vectors you
> >start with, then you are asking for a method of finding an element of
> >the nullspace (a.k.a. the kernel) of A, i.e. the set of solutions x of
> >the equation Ax = 0.
> 
> Yes; once somebody reminded me that this was a linear algebra problem,
> and not just specifically a cross-product problem, it all fell into
> place.
> 
> Thanks to all,
> Greg

With your two vectors you can form a projection matrix - call it P - it
will look very similar to the matrix formula for the least squares
solution. You can then easily find a projection matrix onto the
corresponding null space (or kernel) via (I-P), where I is the identity
matrix. You can then perform an SVD - you will then have a set of mutually
orthogonal vectors, each orthogonal to your original two - so any linear
combination of them would also suit your requirements. You can then choose
a linear combination that meets any other requirements you have.

Note - in this case the inner product would be zero (or at least to
machine and algorithm tolerance). Also this approach may be a little
overkill for what you actually need.
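
A NumPy sketch of that construction (names and sizes mine):

    import numpy as np

    rng = np.random.default_rng()
    N = 300
    V = np.column_stack([rng.standard_normal(N), rng.standard_normal(N)])

    # P projects onto span{v1, v2}; same form as the least-squares
    # "hat" matrix P = V (V^T V)^-1 V^T
    P = V @ np.linalg.solve(V.T @ V, V.T)
    M = np.eye(N) - P                    # projects onto the null space

    # Columns of U with singular value ~1 form an orthonormal basis of
    # the complement; any combination of them is orthogonal to v1 and v2.
    U, s, _ = np.linalg.svd(M)
    basis = U[:, s > 0.5]                # N-2 columns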

A good book, that's also very readable, is Gilbert Strang's "Introduction
to Linear Algebra".

Hope that helps.
Cheers,
David
Dave
12/13/2016 6:11:10 PM