COMPGROUPS.NET

### Gaussian distribution #2


```Hi
I need some random numbers from an uniform distribution like gaussian
distribution.
How can I employ this distribution in fortran 90?

```

```On Monday, May 14, 2012 1:31:01 PM UTC-5, Elaheh wrote:
> Hi
> I need some random numbers from an uniform distribution like gaussian
> distribution.
> How can I employ this distribution in fortran 90?
>

Our library, RANDLIB, contains random number generators for many statistical
distributions, including the uniform and the Gaussian (although not for a
"uniform like Gaussian").  It is written in Fortran 95 and is available free
from

Barry W Brown
```
Reply brownbar (9) 5/14/2012 6:52:44 PM

```Elaheh <elaheh.adibi@gmail.com> wrote:

> I need some random numbers from an uniform distribution like gaussian
> distribution.
> How can I employ this distribution in fortran 90?

Note that the Gaussian and uniform are different distributions, so it
doesn't make much sense to describe something as "uniform like
Gaussian".

Also note that it makes a *LOT* of difference whether you want a
Gaussian distribution or something "like" it. There are several popular
methods for Gaussian. If you want more general distributions, your
options might be far more restricted.

I see that Barry Brown posted a link to a package of routines, so I
won't go dig one up.

--
Richard Maine                    | Good judgment comes from experience;
email: last name at domain . net | experience comes from bad judgment.
domain: summertriangle           |  -- Mark Twain
```
Reply nospam47 (9742) 5/14/2012 10:58:36 PM

```In article <1kk3df7.1uxr8npnv5kkiN%nospam@see.signature>,
nospam@see.signature (Richard Maine) writes:

> Note that the Gaussian and uniform are different distributions, so it
> doesn't make much sense to describe something as "uniform like
> Gaussian".

It is relatively easy to get a random number from a Gaussian
distribution once one has one from a uniform distribution.

Or, if you have some spare CPU cycles, just build the sum of N random
numbers from a uniform distribution and divide by N.  As N approaches
infinity, the sums so built approach a Gaussian distribution.

Of course, this is not very efficient.  However, it does illustrate WHY
error distributions are often Gaussian (i.e. they are the sums of
random, uncorrelated errors).

```
Reply helbig (4874) 5/15/2012 7:59:20 AM
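A minimal Fortran sketch of this idea, with one adjustment: dividing the sum by N yields the sample mean, whose variance shrinks toward zero, so to keep an approximately *standard* normal deviate the sum is instead standardized by its mean N/2 and standard deviation sqrt(N/12). The function name and interface here are illustrative, not from any library:

```fortran
! Sketch: approximate standard normal deviate via the central limit theorem.
! The sum of n uniform(0,1) deviates has mean n/2 and variance n/12;
! standardizing that sum gives an approximately Gaussian value.
function clt_normal(n) result(r)
  implicit none
  integer, intent(in) :: n
  real    :: r, u, s
  integer :: i

  s = 0.0
  do i = 1, n
    call random_number(u)   ! intrinsic: uniform deviate in [0,1)
    s = s + u
  end do
  r = (s - 0.5 * n) / sqrt(n / 12.0)
end function clt_normal
```

Larger n gives a better approximation, at the cost of n calls to random_number per deviate.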

```On 05/14/2012 08:31 PM, Elaheh wrote:
> I need some random numbers from an uniform distribution like gaussian
> distribution.

I take "like" to mean "as well as" here, otherwise it doesn't make sense. I also
assume you are satisfied with non-cryptographically-secure pseudo-random numbers;
otherwise you might want to look into a library like OpenSSL.

> How can I employ this distribution in fortran 90?

real :: u
call random_number(u)

random_number will give you a pseudo-random number from a uniform distribution.

In case you already have two random numbers u and v from a uniform distribution
on (0.0,1.0), you can use the Box-Muller transform (see Wikipedia) to obtain a
normally distributed number r:

real, parameter :: pi = 3.14159265
r = mean + stddev * sqrt(-2.0 * log(u)) * cos(2.0 * pi * v)

Note that the scale factor is the standard deviation, not the variance. Hint:
you might want to look into the distribution random_number actually gives you;
it can return exactly 0.0, which would make log(u) blow up.

Regards, Thomas
```
Reply jahns (51) 5/15/2012 8:36:46 AM
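Thomas's formula can be wrapped into a self-contained subroutine. The sketch below (the name and mean/stddev parameterization are illustrative) redraws u until it is strictly positive, since random_number may return exactly 0.0:

```fortran
! Sketch: one Gaussian deviate via the Box-Muller transform.
subroutine gaussian(mean, stddev, r)
  implicit none
  real, intent(in)  :: mean, stddev
  real, intent(out) :: r
  real, parameter   :: pi = 3.14159265
  real :: u, v

  do                            ! random_number may return exactly 0.0,
    call random_number(u)       ! which would make log(u) blow up,
    if (u > 0.0) exit           ! so redraw until u is strictly positive
  end do
  call random_number(v)

  r = mean + stddev * sqrt(-2.0 * log(u)) * cos(2.0 * pi * v)
end subroutine gaussian
```

Box-Muller actually yields a second independent deviate from the same (u,v) pair via sin instead of cos; a production version would cache it rather than throw it away.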

```On 14.05.12 20:31, Elaheh wrote:
> Hi
> I need some random numbers from an uniform distribution like gaussian
> distribution.
> How can I employ this distribution in fortran 90?
>

There's always the Box--Muller transform. It turns uniformly distributed
random numbers into normally (Gaussian) distributed ones. (See the other
comments on what "uniform distribution" means, too.)

https://en.wikipedia.org/wiki/Box%E2%80%93Muller_transform

Paul
```

```
"Phillip Helbig---undress to reply"  wrote in message
news:jot2co$vr4$1@online.de...

In article <1kk3df7.1uxr8npnv5kkiN%nospam@see.signature>,
nospam@see.signature (Richard Maine) writes:

> Note that the Gaussian and uniform are different distributions, so it
> doesn't make much sense to describe something as "uniform like
> Gaussian".

> It is relatively easy to get a random number from a Gaussian
> distribution once one has one from a uniform distribution.
>
> Or, if you have some spare CPU cycles, just build the sum of N random
> numbers from a uniform distribution and divide by N.  As N approaches
> infinity, the sums so built approach a Gaussian distribution.
>
> Of course, this is not very efficient.  However, it does illustrate WHY
> error distributions are often Gaussian (i.e. they are the sums of
> random, uncorrelated errors).

---> It's not that difficult to calculate the exact pdf and cdf for the sum
of two or three uniforms. For three, it is remarkably close to a standard
normal. But the practical drawback to this method, which was once proposed
and once used, is that the tails are too light.

-- e

```
Reply epc8 (1259) 5/15/2012 3:57:12 PM

```e p chandler <epc8@juno.com> wrote:

(snip)
> It is relatively easy to get a random number from a Gaussian
> distribution once one has one from a uniform distribution.

> Or, if you have some spare CPU cycles, just build the sum of N random
> numbers from a uniform distribution and divide by N.  As N approaches
> infinity, the sums so built approach a Gaussian distribution.

The sum of N uniform deviates follows the Irwin-Hall distribution:

http://en.wikipedia.org/wiki/Irwin%E2%80%93Hall_distribution

It is commonly done by adding up 12 uniformly distributed values and
subtracting six, which approximates the usual normalized Gaussian with
variance of 1.0. (The variance of the sum is N/12, so exactly 1 for N = 12.)

> Of course, this is not very efficient.  However, it does illustrate WHY
> error distributions are often Gaussian (i.e. they are the sums of
> random, uncorrelated errors).

Depending on the speed of COS, SIN, LOG, and SQRT, it could be a
lot faster than Box-Muller.

> ---> It's not that difficult to calculate the exact pdf and cdf for the sum
> of two uniforms or three uniforms. For three, it is remarkably close to a
> standard normal. But the practical drawback to this method, which was once
> proposed and once used, is that the tails will be too light.

Tails are always the problem. Box-Muller should give pretty good tails,
though they somewhat depend on how close to uniform the underlying
generator is for small values. But to actually see good tails, you need
a really large number of samples.

-- glen
```
Reply gah (12261) 5/15/2012 7:48:05 PM
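Glen's 12-sample recipe works because with N = 12 the variance N/12 is exactly 1, so no division is needed. A minimal sketch (function name illustrative):

```fortran
! Sketch: the classic "sum 12 uniforms, subtract 6" approximation.
! The sum has mean 6 and variance 12/12 = 1, so the result is
! approximately standard normal -- but with tails cut off at +/-6,
! which is the lightness the thread warns about.
function sum12_normal() result(r)
  implicit none
  real    :: r, u
  integer :: i

  r = -6.0
  do i = 1, 12
    call random_number(u)
    r = r + u
  end do
end function sum12_normal
```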

```>> Of course, this is not very efficient.  However, it does illustrate WHY
>> error distributions are often Gaussian (i.e. they are the sums of
>> random, uncorrelated errors).
>
> Depending on the speed of COS, SIN, LOG, and SQRT, it could be a
> lot faster than Box-Muller.

Without going into the "speed" debate, I'd like to point out that the
polar (Marsaglia) variant of Box-Muller can be done without any COS or SIN.
https://en.wikipedia.org/wiki/Box%E2%80%93Muller_transform

There's only one SQRT and one LOG evaluation per pair of deviates, at most.
That should help performance a bit, too.

Paul
```
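The variant Paul refers to is Marsaglia's polar method: reject points outside the unit disc, then spend one SQRT and one LOG to get two deviates. A minimal sketch (subroutine name illustrative):

```fortran
! Sketch: Marsaglia's polar variant of Box-Muller.
! Draws points uniformly in the unit disc by rejection, then
! produces two independent standard normal deviates without sin/cos.
subroutine polar_normal(z1, z2)
  implicit none
  real, intent(out) :: z1, z2
  real :: u, v, s, f

  do
    call random_number(u)
    call random_number(v)
    u = 2.0 * u - 1.0                 ! map [0,1) onto [-1,1)
    v = 2.0 * v - 1.0
    s = u * u + v * v
    if (s > 0.0 .and. s < 1.0) exit   ! keep points strictly inside the disc
  end do

  f  = sqrt(-2.0 * log(s) / s)
  z1 = u * f
  z2 = v * f
end subroutine polar_normal
```

About 1 - pi/4 (roughly 21%) of the candidate pairs are rejected, which is usually cheaper than evaluating COS and SIN.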

8 Replies
194 Views
