**International Journal of Data Envelopment Analysis and Operations Research**

## On The Continuous Poisson Distribution

**Salah H Abid**^{1}, **Sajad H Mohammed**^{1}

^{1}Mathematics Department, Education College, Al-Mustansiriya University, Baghdad, Iraq


### Abstract

No scientific works deal directly and extensively with the continuous Poisson distribution (CPD); there are only rare allusions here and there. In this paper we take up this issue. We consider the continuous Poisson distribution and study different methods for estimating its parameter: the maximum likelihood estimator, the moments estimator, the percentile estimator, the least squares estimator and the weighted least squares estimator. An empirical study is conducted to compare the performance of these methods. We also consider the generation of random deviates. Further empirical experiments are conducted to build a model for the bandwidth parameter used in Poisson density estimation.

**Keywords:** Continuous Poisson distribution, MLE, Percentile estimator, bandwidth selection, density estimation, AR(1) model

**Copyright**© 2016 Science and Education Publishing. All Rights Reserved.

### Cite this article:

- Salah H Abid, Sajad H Mohammed. On The Continuous Poisson Distribution. *International Journal of Data Envelopment Analysis and Operations Research*. Vol. 2, No. 1, 2016, pp 7-15. http://pubs.sciepub.com/ijdeaor/2/1/2



### 1. Introduction

The Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time or space (distance, area, volume), provided these events occur with a known average rate and independently of the time since the last event.

Because the processes that generate the Poisson distribution are countless, this distribution is one of the most widely used distributions in practical applications ^{[8, 12]}.

In various applied research papers, many authors make extensive use of what they call a continuous Poisson distribution, giving the term very different, and not always correct, meanings.

For example, as a "continuous Poisson distribution (CPD)", some authors use the Gamma distribution, e.g. Alissandrakis, Dialetis and Tsiropoula (1987) ^{[2]} and Herzog et al. (2010) ^{[9]}, or even simply the exponential distribution, e.g. Webb (2000) ^{[24]}.

From a strictly formal point of view, these distributions cannot be regarded as genuine continuous analogues of the classical Poisson law, since they have probabilistically little in common with it.

Some other authors, Ilienko and Podkalyuk (2010) ^{[10]} and Turowski (2010) ^{[21]}, defined an absolutely continuous distribution with a density of the form,

$$f(x) = c(\lambda)\,\frac{e^{-\lambda}\lambda^{x}}{\Gamma(x+1)}, \qquad x \ge 0, \tag{1}$$

where $c(\lambda)$ is a normalizing constant. Actually, no one has determined it exactly. This task will be considered as an essential aim of this paper; the CPD properties will surely follow this step.

In this paper we write $X \sim CP(\lambda)$ to mean that the random variable $X$ follows the continuous Poisson distribution with parameter $\lambda$.

### 2. Continuous Poisson Distribution

By continuous Poisson distribution with parameter $\lambda$, Ilienko in 2013 ^{[10]} defined the probability measure supported on $(0,\infty)$ with distribution function of the form,

$$F_{\lambda}(t) = \frac{\Gamma(t,\lambda)}{\Gamma(t)}, \qquad t > 0, \tag{2}$$

where $\Gamma(t,\lambda) = \int_{\lambda}^{\infty} u^{t-1} e^{-u}\,du$ is the upper incomplete gamma function and $\Gamma(t)$ is the ordinary gamma function.
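Since (2) involves only a ratio of incomplete gamma functions, the CDF is easy to evaluate numerically. Below is a minimal sketch in Python (the paper's own experiments used Matlab); `reg_lower_gamma` and `cp_cdf` are our own helper names, and the series used is the standard power series for the regularized lower incomplete gamma.

```python
import math

def reg_lower_gamma(s, x):
    """Regularized lower incomplete gamma P(s, x) = gamma(s, x)/Gamma(s),
    computed from its power series (all terms positive, so no cancellation)."""
    term = total = 1.0 / s
    for k in range(1, 1000):
        term *= x / (s + k)
        total += term
        if term < total * 1e-15:
            break
    return math.exp(s * math.log(x) - x - math.lgamma(s)) * total

def cp_cdf(t, lam):
    """Continuous Poisson CDF: F(t) = Gamma(t, lam)/Gamma(t) for t > 0."""
    return 1.0 - reg_lower_gamma(t, lam) if t > 0 else 0.0

# F is increasing in t, starts at 0 and tends to 1
for t in (0.1, 1.0, 2.0, 5.0, 15.0):
    print(t, round(cp_cdf(t, 2.0), 6))
```

At integer orders the upper incomplete gamma has a closed form, e.g. $\Gamma(2,\lambda)=e^{-\lambda}(1+\lambda)$, so $F_{\lambda=2}(2)=3e^{-2}\approx 0.406$, which the code reproduces.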

Now, by using the following facts,

1. $\dfrac{\partial}{\partial t}\Gamma(t,\lambda) = \Gamma(t,\lambda)\ln\lambda + \lambda\, G_{2,3}^{3,0}\!\left(\lambda \,\middle|\, {1,1 \atop 0,0,t}\right)$, where $G$ is the Meijer G-function,

2. $\dfrac{d}{dt}\Gamma(t) = \Gamma(t)\,\psi(t)$, where $\psi(t)$ is the digamma function,

one can derive the probability density function of $CP(\lambda)$ as follows,

$$f_{\lambda}(t) = \frac{d}{dt}F_{\lambda}(t) = \frac{1}{\Gamma(t)}\,\frac{\partial \Gamma(t,\lambda)}{\partial t} - \frac{\Gamma(t,\lambda)}{\Gamma(t)}\,\psi(t). \tag{3}$$

For a fixed $k \in \mathbb{N}$, Ilienko in 2013 ^{[10]} gave the $k$-order moment function of the continuous Poisson distribution as follows,

$$E\!\left(X^{k}\right) = k\int_{0}^{\infty} t^{k-1}\left[1 - F_{\lambda}(t)\right]dt = k\int_{0}^{\infty} t^{k-1}\,\frac{\gamma(t,\lambda)}{\Gamma(t)}\,dt. \tag{4}$$

Since $\gamma(t,\lambda) = \int_{0}^{\lambda} u^{t-1} e^{-u}\,du$ is the lower incomplete gamma function, one can develop the above formula in terms of the cumulative distribution function of a Gamma random variable, evaluated at $\lambda$, with shape parameter $t$ and scale parameter 1,

$$E\!\left(X^{k}\right) = k\int_{0}^{\infty} t^{k-1}\, F_{G}(\lambda;\, t,\, 1)\, dt. \tag{5}$$

Now, since $\gamma(t,\lambda) = \Gamma(t) - \Gamma(t,\lambda)$, it is possible to simplify (5) to,

$$E\!\left(X^{k}\right) = k\int_{0}^{\infty} t^{k-1}\left[1 - \frac{\Gamma(t,\lambda)}{\Gamma(t)}\right]dt. \tag{6}$$

It is appropriate to state that, for purposes of calculating $E(X^k)$, we used one of the most powerful Monte Carlo variance reduction techniques, correlated sampling, to evaluate the above integral empirically.
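As an independent check on the correlated-sampling computation, the moment integral can also be evaluated by plain quadrature. The sketch below (our own code, not the paper's) uses the midpoint rule on the survival function $\gamma(t,\lambda)/\Gamma(t)$:

```python
import math

def reg_lower_gamma(s, x):
    # Regularized lower incomplete gamma P(s, x), power-series evaluation.
    term = total = 1.0 / s
    for k in range(1, 1000):
        term *= x / (s + k)
        total += term
        if term < total * 1e-15:
            break
    return math.exp(s * math.log(x) - x - math.lgamma(s)) * total

def cp_moment(k, lam, upper=50.0, steps=3000):
    """k-th moment E(X^k) = k * integral of t^(k-1) * gamma(t, lam)/Gamma(t) dt,
    evaluated with the midpoint rule on (0, upper]."""
    dt = upper / steps
    total = 0.0
    for i in range(steps):
        t = (i + 0.5) * dt
        total += k * t ** (k - 1) * reg_lower_gamma(t, lam) * dt
    return total

print(cp_moment(1, 2.0))   # mean of CP(2)
```

Since $1-F_\lambda(t)$ equals $\Pr(\mathrm{Poisson}(\lambda) \ge t)$ at integer $t$ and is decreasing in $t$, the mean must lie strictly between $\lambda$ and $\lambda+1$; for $\lambda=2$ the quadrature gives roughly 2.5.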

The reliability function of the continuous Poisson random variable is,

$$R_{\lambda}(t) = 1 - F_{\lambda}(t) = \frac{\gamma(t,\lambda)}{\Gamma(t)}, \tag{7}$$

and the hazard function is,

$$h_{\lambda}(t) = \frac{f_{\lambda}(t)}{R_{\lambda}(t)} = \frac{\Gamma(t)\, f_{\lambda}(t)}{\gamma(t,\lambda)}. \tag{8}$$

### 3. Parameter Estimation of CPD

The main aim of this section is to study several different estimators of the unknown parameter $\lambda$ of the CPD.

(1) *Maximum likelihood estimator (MLE).*

If $x_1, x_2, \ldots, x_n$ is a random sample from $CP(\lambda)$, then the likelihood function is,

$$L(\lambda) = \prod_{i=1}^{n} f_{\lambda}(x_i),$$

and the log-likelihood function is,

$$\ell(\lambda) = \sum_{i=1}^{n} \ln f_{\lambda}(x_i).$$

So, the normal equation becomes,

$$\frac{\partial \ell(\lambda)}{\partial \lambda} = \sum_{i=1}^{n} \frac{\partial \ln f_{\lambda}(x_i)}{\partial \lambda} = 0.$$

Since this equation has no closed-form solution, we solve it numerically to obtain $\hat{\lambda}_{MLE}$, the ML estimator of $\lambda$.
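Any direct search over the log-likelihood serves as such a numerical solution. A sketch under the same CDF assumption as before, with the density obtained by numerical differentiation of the CDF and an invented data vector for illustration:

```python
import math

def reg_lower_gamma(s, x):
    # Regularized lower incomplete gamma P(s, x), power-series evaluation.
    term = total = 1.0 / s
    for k in range(1, 1000):
        term *= x / (s + k)
        total += term
        if term < total * 1e-15:
            break
    return math.exp(s * math.log(x) - x - math.lgamma(s)) * total

def cp_cdf(t, lam):
    return 1.0 - reg_lower_gamma(t, lam) if t > 0 else 0.0

def cp_pdf(t, lam, eps=1e-5):
    # Density by central difference of the CDF (f has no simple closed form).
    return (cp_cdf(t + eps, lam) - cp_cdf(t - eps, lam)) / (2 * eps)

def mle(data, grid=None):
    """Grid search maximizing the log-likelihood sum(ln f(x_i; lam))."""
    if grid is None:
        grid = [0.05 * j for j in range(1, 121)]      # lam in (0, 6]
    return max(grid, key=lambda lam: sum(math.log(cp_pdf(x, lam)) for x in data))

sample = [0.8, 1.5, 2.3, 0.4, 1.1, 1.9, 2.8, 1.3]     # illustrative data only
print("MLE estimate:", mle(sample))
```

A finer grid, or golden-section search around the grid optimum, sharpens the estimate at little extra cost.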

(2) *The exact moments estimator (EME).*

The *method of moments* is a technique for constructing estimators of the parameters based on matching the sample moments with the corresponding distribution moments.

Here we provide the method of moments estimator of the parameter $\lambda$ of a CPD. If $X$ follows $CP(\lambda)$, then by equating $E(X)$ in equation (6), which represents the population mean, with the sample mean $\bar{x}$ and solving the resulting equation numerically, one obtains $\hat{\lambda}_{EME}$, the moments estimator of $\lambda$.
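Because $E(X;\lambda)$ is strictly increasing in $\lambda$ and lies strictly between $\lambda$ and $\lambda+1$ (by the survival-function representation), the moment equation $E(X;\lambda)=\bar{x}$ can be solved by simple bisection. A sketch, with an illustrative sample of our own:

```python
import math

def reg_lower_gamma(s, x):
    # Regularized lower incomplete gamma P(s, x), power-series evaluation.
    term = total = 1.0 / s
    for k in range(1, 1000):
        term *= x / (s + k)
        total += term
        if term < total * 1e-15:
            break
    return math.exp(s * math.log(x) - x - math.lgamma(s)) * total

def cp_mean(lam, upper=50.0, steps=1200):
    # E(X) = integral of the survival function gamma(t, lam)/Gamma(t).
    dt = upper / steps
    return sum(reg_lower_gamma((i + 0.5) * dt, lam) * dt for i in range(steps))

def eme(data):
    """Moments estimator: bisection on lam, using lam < E(X; lam) < lam + 1."""
    xbar = sum(data) / len(data)
    lo, hi = max(1e-6, xbar - 1.0), xbar
    for _ in range(30):
        mid = 0.5 * (lo + hi)
        if cp_mean(mid) < xbar:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

sample = [0.8, 1.5, 2.3, 0.4, 1.1, 1.9, 2.8, 1.3]     # illustrative data only
print("EME estimate:", eme(sample))
```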

(3) *Estimators based on percentiles (PE).*

Kao (1959) ^{[11]} originally explored this method by using a graphical approximation to the best linear unbiased estimators. The estimators are obtained by fitting a straight line to the theoretical points obtained from the distribution function and the sample percentile points. In the case of a CPD, the same concept can be used to obtain an estimator of $\lambda$ because of the structure of its distribution function $F_\lambda$ defined in (2). Let $x_{(1)} < x_{(2)} < \cdots < x_{(n)}$ denote the ordered sample and let $p_i = i/(n+1)$ be the estimate of $F_\lambda(x_{(i)})$. Then $\hat{\lambda}_{PE}$ can be obtained by minimizing

$$\sum_{i=1}^{n}\left(x_{(i)} - F_{\lambda}^{-1}(p_i)\right)^{2}$$

with respect to $\lambda$. For some other choices of $p_i$, see Mann, Schafer and Singpurwalla (1974) ^{[13]}.
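The only non-trivial ingredient is the quantile $F_\lambda^{-1}(p)$, which can be obtained by bisection since $F_\lambda$ is continuous and strictly increasing. A sketch (grid bounds and data are our own illustrative choices):

```python
import math

def reg_lower_gamma(s, x):
    # Regularized lower incomplete gamma P(s, x), power-series evaluation.
    term = total = 1.0 / s
    for k in range(1, 1000):
        term *= x / (s + k)
        total += term
        if term < total * 1e-15:
            break
    return math.exp(s * math.log(x) - x - math.lgamma(s)) * total

def cp_cdf(t, lam):
    return 1.0 - reg_lower_gamma(t, lam) if t > 0 else 0.0

def cp_quantile(p, lam):
    # Invert the CDF by bisection (F is continuous and strictly increasing).
    lo, hi = 1e-9, lam + 10.0 * math.sqrt(lam) + 10.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if cp_cdf(mid, lam) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def pe(data):
    """Percentile estimator: least-squares fit of the order statistics
    to the theoretical quantiles at p_i = i / (n + 1)."""
    xs = sorted(data)
    n = len(xs)
    ps = [(i + 1) / (n + 1) for i in range(n)]
    grid = [0.1 * j for j in range(1, 51)]            # lam in (0, 5]
    return min(grid, key=lambda lam: sum((x - cp_quantile(p, lam)) ** 2
                                         for x, p in zip(xs, ps)))

sample = [0.8, 1.5, 2.3, 0.4, 1.1, 1.9, 2.8, 1.3]     # illustrative data only
print("PE estimate:", pe(sample))
```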

(4) *Least squares (LSE) and weighted least squares (WLSE) estimators.*

This method was originally suggested by Swain, Venkatraman and Wilson (1988) ^{[20]} to estimate the parameters of the Beta distribution. Suppose $x_1, \ldots, x_n$ is a random sample of size $n$ from a distribution function $F(\cdot)$ and $x_{(1)} < x_{(2)} < \cdots < x_{(n)}$ denotes the ordered sample. The method uses the distribution of $F(x_{(j)})$. For a sample of size $n$, we have ^{[13]},

$$E\!\left[F(x_{(j)})\right] = \frac{j}{n+1}$$

and

$$V\!\left[F(x_{(j)})\right] = \frac{j\,(n-j+1)}{(n+1)^{2}(n+2)}.$$

So, in the case of a CPD, one can obtain the LS estimator of $\lambda$, say $\hat{\lambda}_{LSE}$, by minimizing

$$\sum_{j=1}^{n}\left(F_{\lambda}(x_{(j)}) - \frac{j}{n+1}\right)^{2}$$

with respect to the unknown parameter $\lambda$. The weighted least squares estimator of $\lambda$, say $\hat{\lambda}_{WLSE}$, can be obtained by minimizing

$$\sum_{j=1}^{n} w_j\left(F_{\lambda}(x_{(j)}) - \frac{j}{n+1}\right)^{2}$$

with respect to $\lambda$, where

$$w_j = \frac{1}{V\!\left[F(x_{(j)})\right]} = \frac{(n+1)^{2}(n+2)}{j\,(n-j+1)}.$$
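Both criteria above can be minimized by the same routine, toggling the weights. A sketch under the same CDF assumption, with illustrative data:

```python
import math

def reg_lower_gamma(s, x):
    # Regularized lower incomplete gamma P(s, x), power-series evaluation.
    term = total = 1.0 / s
    for k in range(1, 1000):
        term *= x / (s + k)
        total += term
        if term < total * 1e-15:
            break
    return math.exp(s * math.log(x) - x - math.lgamma(s)) * total

def cp_cdf(t, lam):
    return 1.0 - reg_lower_gamma(t, lam) if t > 0 else 0.0

def ls_estimate(data, weighted=False):
    """LSE / WLSE: minimize sum w_j (F(x_(j); lam) - j/(n+1))^2 over a grid,
    with w_j = (n+1)^2 (n+2) / (j (n-j+1)) in the weighted case."""
    xs = sorted(data)
    n = len(xs)
    def objective(lam):
        total = 0.0
        for j, x in enumerate(xs, start=1):
            w = (n + 1) ** 2 * (n + 2) / (j * (n - j + 1)) if weighted else 1.0
            total += w * (cp_cdf(x, lam) - j / (n + 1)) ** 2
        return total
    grid = [0.05 * j for j in range(1, 101)]          # lam in (0, 5]
    return min(grid, key=objective)

sample = [0.8, 1.5, 2.3, 0.4, 1.1, 1.9, 2.8, 1.3]     # illustrative data only
print("LSE :", ls_estimate(sample))
print("WLSE:", ls_estimate(sample, weighted=True))
```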

**3.1. Generating Continuous Poisson-Distributed Random Variables**

It is well known that if the time intervals between like events are exponentially distributed, the number of events occurring in a unit interval of time has the discrete Poisson distribution. We can use this relationship between the exponential and discrete Poisson distributions to generate deviates from the discrete Poisson distribution. A discrete Poisson deviate $N$ can be defined in the following manner,

$$N = \min\left\{m : \sum_{i=1}^{m+1} E_i > 1\right\},$$

where the $E_i$ are random deviates from the exponential distribution with mean $1/\lambda$, generated by the inverse transform technique, $E_i = -\frac{1}{\lambda}\ln U_i$, where $U_i$ is from the standard continuous uniform distribution. In summary, the cumulative sums are generated until the inequality holds; when this occurs, $N = m$ is the discrete Poisson random deviate desired.

By using the above argument, one can generate deviates from the continuous Poisson distribution as,

(9)
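Independently of scheme (9), any distribution with a computable CDF can be sampled by inverse transform: draw $U \sim \mathrm{Uniform}(0,1)$ and solve $F_\lambda(X) = U$ numerically. The sketch below (a generic alternative, not necessarily the authors' scheme) does this by bisection:

```python
import math
import random

def reg_lower_gamma(s, x):
    # Regularized lower incomplete gamma P(s, x), power-series evaluation.
    term = total = 1.0 / s
    for k in range(1, 1000):
        term *= x / (s + k)
        total += term
        if term < total * 1e-15:
            break
    return math.exp(s * math.log(x) - x - math.lgamma(s)) * total

def cp_cdf(t, lam):
    return 1.0 - reg_lower_gamma(t, lam) if t > 0 else 0.0

def cp_deviate(lam, rng=random):
    """Inverse-transform sampling: solve F(x) = U by bisection."""
    u = rng.random()
    lo, hi = 1e-9, lam + 10.0 * math.sqrt(lam) + 15.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if cp_cdf(mid, lam) < u:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

random.seed(7)
draws = [cp_deviate(2.0) for _ in range(200)]
print("sample mean:", sum(draws) / len(draws))   # should land between lam and lam + 1
```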

**3.2. The Empirical Study and Discussions**

We conducted extensive simulations to compare the performance of the different estimation methods described earlier in this section, mainly with respect to their mean square errors (MSE), for different sample sizes and different parameter values, with a fixed run size.

The results are reported in Table 1. From the table, we observe that,

1) The MSEs decrease as the sample size increases for all methods of estimation, which verifies the asymptotic unbiasedness and consistency of all the estimators.

2) The performances of WLSE, LSE and EME are according to their order.

3) For small (n=10) and moderate (n=20) sample sizes, the PE method works best, whereas the second best method is MLE.

4) For large (n=50, 100) sample sizes, MLE works best among all methods, whereas the second best method is PE.

### 4. Empirical Selection of Bandwidth for CP Kernel Density Estimation

Let $X_1, X_2, \ldots, X_n$ denote a random sample of size $n$ from a random variable $X$ with density $f$. The kernel density estimate of $f$ at the point $x$ is given by ^{[18]},

$$\hat{f}_h(x) = \frac{1}{nh}\sum_{i=1}^{n} K\!\left(\frac{x - X_i}{h}\right), \tag{10}$$

where the smoothing parameter $h$ is known as the bandwidth and the kernel $K$ is generally chosen to be a unimodal probability density symmetric about zero. In this case, $K$ satisfies the following conditions ^{[16, 17]},

$$K(u) \ge 0, \qquad \int K(u)\,du = 1, \qquad \int u\,K(u)\,du = 0, \qquad 0 < \int u^{2} K(u)\,du < \infty.$$

Actually, there are many popular choices of the kernel function *K*: uniform, triangular, biweight, triweight, Epanechnikov, normal, and others ^{[18]}. The Epanechnikov kernel is optimal in a mean square error sense ^{[6]}.
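Estimator (10) with the Epanechnikov kernel takes only a few lines. A minimal sketch, with sample data invented for illustration:

```python
def epanechnikov(u):
    # K(u) = 0.75 * (1 - u^2) on [-1, 1], zero elsewhere.
    return 0.75 * (1.0 - u * u) if abs(u) <= 1.0 else 0.0

def kde(x, data, h):
    """Kernel density estimate (1/nh) * sum K((x - X_i)/h)."""
    return sum(epanechnikov((x - xi) / h) for xi in data) / (len(data) * h)

data = [1.0, 1.5, 2.0, 2.5, 3.0]                      # illustrative sample
h = 0.5
grid = [-1.0 + 0.01 * k for k in range(701)]          # covers [-1, 6]
density = [kde(x, data, h) for x in grid]

# The estimate is a genuine density: non-negative and integrating to one.
print("min:", min(density), "approx. integral:", sum(d * 0.01 for d in density))
```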

The bandwidth controls the smoothness of the density estimate and strongly influences its appearance, so selecting a suitable *h* is a pivotal step in estimating *f(x)*. There has been great advancement in recent years in data-based bandwidth selection for kernel density estimation. Some "second generation" methods, including plug-in ^{[16, 22, 23]} and smoothed bootstrap techniques ^{[5, 7, 19]}, have been developed that are far superior to well-known "first generation" methods, such as rules of thumb ^{[17, 23]}, least squares cross-validation ^{[3, 15]}, and biased cross-validation ^{[4, 15]}, most of which were proposed before 1990. Many authors recommend a "solve-the-equation" plug-in bandwidth selector as the most trustworthy in terms of overall performance ^{[17, 18, 19, 21]}.

**4.1. The Empirical Study and Discussions**

A simulation experiment was conducted to find a suitable model to represent the bandwidth series for Poisson density estimation, using Matlab software, according to the following assumptions,

a. Generate observations from the continuous Poisson distribution with $\lambda = 1$. In simulation experiments, the investigator commonly takes the (real) parameter values around (above and below) some critical value in order to get a wide range of conclusions. Here we consider $\lambda = 1$ as the critical point, so its estimates will surely be near 1, a little higher or lower. This is, without loss of generality, our aim.

b. The sample sizes .

c. The Run size value was .

d. The bandwidth value was calculated according to the window-closing algorithm.

e. For reasons of computational efficiency, we use the Epanechnikov kernel, $K(u) = \frac{3}{4}\left(1 - u^{2}\right)$ for $|u| \le 1$ and $K(u) = 0$ otherwise.

f. To determine $h$, one can use the **window closing method**, which permits one to progressively detect the shape of the density. The idea behind the "window closing" mechanism is to begin with an arbitrary but large $h$ and compute the resulting density estimate. We then calculate a further series of density estimates by closing the window, i.e. by gradually decreasing $h$. The initial estimate based on a large $h$ will be a very smooth function, but as we decrease $h$, the estimates gradually display more detail and become more "eccentric" in form. By examining the manner in which the estimates change as we decrease $h$, we may be able to detect the point at which the smoothing has been relaxed "too far", and this then enables us to select a suitable value for the bandwidth.

g. We compute empirical values of the smoothing parameter (bandwidth) $h$, the mean, the standard deviation, the interquartile range, the skewness and the kurtosis.
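The window-closing step (f) can be automated by tracking how the number of modes of the estimate grows as $h$ shrinks; spurious extra modes signal that smoothing has been relaxed too far. A sketch (kernel, grid and two-cluster sample are our own illustrative choices):

```python
def epanechnikov(u):
    return 0.75 * (1.0 - u * u) if abs(u) <= 1.0 else 0.0

def kde(x, data, h):
    return sum(epanechnikov((x - xi) / h) for xi in data) / (len(data) * h)

def mode_count(data, h, lo, hi, step=0.01):
    # Count strict local maxima of the density estimate on a grid.
    grid = [lo + step * k for k in range(int((hi - lo) / step) + 1)]
    f = [kde(x, data, h) for x in grid]
    return sum(1 for i in range(1, len(f) - 1) if f[i - 1] < f[i] > f[i + 1])

data = [1.0, 1.2, 1.4, 4.0, 4.2, 4.4]                 # two well-separated clusters
for h in (2.0, 1.0, 0.5, 0.25):                       # closing the window
    print("h =", h, "modes =", mode_count(data, h, -1.0, 6.5))
```

With a large $h$ the two clusters blur together; once $h$ is small enough, the estimate resolves both, and decreasing $h$ further only adds eccentric detail.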

The results of the simulation experiment are recorded in Table 2.

Now, we analyze and discuss the contents of Table 2,

1. It is easy to show that the model representing the bandwidth $h$ as a function of the mean, standard deviation, interquartile range and sample size is,

For practical applications, the mean, standard deviation and interquartile range can be replaced respectively by the sample mean, sample standard deviation and sample interquartile range.

Following are some notes and indicators related to the above model,

a. $\lambda$ can be calculated from equation (9), so the model can also be written in terms of the parameter $\lambda$.

b. The model fits the data almost perfectly, as measured by the coefficient of determination $R^2$.

c. The ANOVA table is,

It is clear that the model is significant, since the calculated F statistic, 3119.88, exceeds the tabulated F value at all common significance levels, 0.01, 0.05 and 0.10.

The mean square error also indicates that the model is quite suitable.

d. The above model is a development of Silverman's rule of thumb ^{[17, 18]}, which is $h = 0.9\,A\,n^{-1/5}$, where $A = \min\left(\hat{\sigma},\, \frac{IQR}{1.34}\right)$. Actually, many computer packages use this rule to determine $h$, so it is a very popular and reliable rule.
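For comparison with the fitted model, Silverman's rule is a one-liner. A sketch using Python's standard `statistics` module (the inclusive quantile method is one common convention for the IQR):

```python
import statistics

def silverman_bandwidth(data):
    """h = 0.9 * min(sample std, IQR / 1.34) * n^(-1/5)."""
    n = len(data)
    s = statistics.stdev(data)
    q1, _, q3 = statistics.quantiles(data, n=4, method="inclusive")
    return 0.9 * min(s, (q3 - q1) / 1.34) * n ** (-0.2)

print(silverman_bandwidth([1, 2, 3, 4, 5]))
```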

2. Among many time series models, an AR(1) model is the best one to represent the bandwidth series, as follows,

Following are some notes and indicators related to the above model,

a. We chose the above model as the best one according to the Akaike criterion and the behaviour of the autocorrelation and partial autocorrelation functions (see Figure 2).

b. Estimated white noise variance = 0.0249264 with 146 degrees of freedom.

c. For diagnostic checking of the residuals, the portmanteau chi-square test statistic on the first 20 residual autocorrelations is 20.9827, with probability of a larger value under white noise of 0.337758. Since 0.337758 exceeds the common significance levels 0.10, 0.05 and 0.01, the residual series is random and independent over time, so it carries no further model (see Figure 3).

d. Table 3 contains the pessimistic, optimistic and usual forecasts for the bandwidth series. Figure 1 shows these forecasts together with the original bandwidth series.
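An AR(1) fit of the form $x_t = c + \phi\, x_{t-1} + \varepsilon_t$ can be obtained from the lag-1 sample autocorrelation (the Yule–Walker estimate). A sketch on a synthetic series, since the Table 2 bandwidth series itself is not reproduced here:

```python
import random

def fit_ar1(series):
    """Yule-Walker estimates for x_t = c + phi * x_(t-1) + e_t."""
    m = sum(series) / len(series)
    num = sum((series[t] - m) * (series[t + 1] - m) for t in range(len(series) - 1))
    den = sum((x - m) ** 2 for x in series)
    phi = num / den
    return (1.0 - phi) * m, phi          # intercept c and slope phi

# Synthetic AR(1) series with true phi = 0.7 (stands in for the bandwidth series)
random.seed(3)
x = [0.5]
for _ in range(300):
    x.append(0.2 + 0.7 * x[-1] + random.gauss(0.0, 0.15))

c, phi = fit_ar1(x)
print("c =", round(c, 3), "phi =", round(phi, 3))
print("one-step forecast:", round(c + phi * x[-1], 3))
```

Forecast intervals (the "pessimistic" and "optimistic" bands) follow by adding multiples of the white-noise standard deviation to the one-step forecast.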

**Figure 1.** Bandwidth series and its pessimistic, optimistic and usual forecasts

**Figure 2.** Autocorrelation function (ACF) and partial autocorrelation function (PACF) for the bandwidth series

**Figure 3.** Autocorrelation function (ACF) and partial autocorrelation function (PACF) for the residual series

### 5. Summary and Conclusions

In view of the great importance of the Poisson distribution in statistical analysis, the continuous Poisson distribution (CPD) is considered here. Different methods for estimating the CPD parameter are studied: the maximum likelihood estimator, the moments estimator, the percentile estimator, the least squares estimator and the weighted least squares estimator. An empirical study was conducted to compare these methods. It appears that the percentile estimator is the best one for small and moderate samples, whereas the maximum likelihood estimator is the best for large samples. Other empirical experiments were conducted to build a model for the bandwidth parameter used in Poisson density estimation; two models with high-quality properties were obtained to represent the bandwidth.

### References

[1] Abramowitz, M. and Stegun, I. (1970). "Handbook of Mathematical Functions: with Formulas, Graphs, and Mathematical Tables", 9th revised edition, Dover Publications, USA.

[2] Alissandrakis, C., Dialetis, D. and Tsiropoula, G. (1987). "Determination of the mean lifetime of solar features from photographic observations", Astronomy and Astrophysics, 174(1-2), 275-280.

[3] Bowman, A. (1984). "An alternative method of cross-validation for the smoothing of density estimates", Biometrika, 71, 353-360.

[4] Bowman, A. and Azzalini, A. (1997). "Applied Smoothing Techniques for Data Analysis: The Kernel Approach with S-Plus Illustrations", Oxford University Press.

[5] Deheuvels, P. (1977). "Estimation non paramétrique de la densité par histogrammes généralisés", Revue de Statistique Appliquée, 25, 5-42.

[6] Epanechnikov, V. A. (1969). "Non-parametric estimation of a multivariate probability density", Theory of Probability and its Applications, 14, 153-158.

[7] Faraway, J. and Jhun, M. (1990). "Bootstrap choice of bandwidth for density estimation", Journal of the American Statistical Association, 85, 1119-1122.

[8] Haight, F. (1967). "Handbook of the Poisson Distribution", 1st edition, Wiley, USA.

[9] Herzog, A., Binder, T., Friedl, F., Jähne, B. and Kostina, A. (2010). "Estimating water-sided vertical gas concentration profiles by inverse modeling", International Conference on Engineering Optimization, September 6-9, Lisbon, Portugal.

[10] Ilienko, A. and Klesov, O. (2013). "Continuous counterparts of Poisson and binomial distributions and their properties", Annales Univ. Sci. Budapest., Sect. Comp., 39, 137-147.

[11] Kao, J. (1959). "A graphical estimation of mixed Weibull parameters in life-testing of electron tubes", Technometrics, 1, 389-407.

[12] Kingman, J. (1993). "Poisson Processes", 1st edition, Oxford Studies in Probability 3, Clarendon Press, UK.

[13] Mann, N., Schafer, R. and Singpurwalla, N. (1974). "Methods for Statistical Analysis of Reliability and Life Data", Wiley, New York.

[14] Scott, D. (1979). "On optimal and data-based histograms", Biometrika, 66, 605-610.

[15] Scott, D. and Terrell, G. (1987). "Biased and unbiased cross-validation in density estimation", Journal of the American Statistical Association, 82, 1131-1146.

[16] Sheather, S. and Jones, M. (1991). "A reliable data-based bandwidth selection method for kernel density estimation", Journal of the Royal Statistical Society, Series B, 53, 683-690.

[17] Sheather, S. (2004). "Density estimation", Statistical Science, 19(4), 588-597.

[18] Silverman, B. (1986). "Density Estimation for Statistics and Data Analysis", Chapman and Hall, London.

[19] Simonoff, J. (1996). "Smoothing Methods in Statistics", Springer, New York.

[20] Swain, J., Venkatraman, S. and Wilson, J. (1988). "Least-squares estimation of distribution functions in Johnson's translation system", Journal of Statistical Computation and Simulation, 29, 271-297.

[21] Turowski, J. (2010). "Probability distributions of bed load transport rates: A new derivation and comparison with field data", Water Resources Research, 46(8).

[22] Venables, W. and Ripley, B. (2002). "Modern Applied Statistics with S", 4th edition, Springer, New York.

[23] Wand, M. and Jones, M. (1995). "Kernel Smoothing", Chapman & Hall/CRC, London.

[24] Webb, G. (2000). "MultiBoosting: A technique for combining boosting and wagging", Machine Learning, 40, 159-196.