
Research Article

Open Access Peer-reviewed

Samia Mederessi, Slaheddine Hallara

Published online: April 23, 2018

Oil markets are more competitive and volatile than ever before, which places the accurate and reliable measurement of market risks in a crucial position for both investment decisions and the design of hedging strategies. This paper attempts to measure risks in the oil market using Value at Risk (VaR) theory. To estimate VaR with higher accuracy and reliability, this paper proposes Wavelet Denoised Value at Risk (WDNVaR) estimates and compares them with the classical ARMA-GARCH approach. The performance of both approaches has been tested and compared using Kupiec backtesting procedures. Empirical studies of the proposed WDNVaR have been conducted on two major oil markets (i.e. WTI and Brent). Experiment results confirm that WDNVaR significantly improves the accuracy and reliability of VaR estimates over the traditional ARMA-GARCH approach, which results from its capability to clean up the data and alleviate distortions introduced by outliers.

Crude oil markets can be volatile and risky. World crude oil prices have risen dramatically during the past decade. However, oil prices did not sustain a constant rise; rather, they showed high volatility, reflecting market conditions such as political turmoil, supply disruptions, unexpectedly high demand and speculation.

Oil markets have long been among the most volatile, since shocks and the associated risk of losses can prevail in the market due to low inventory levels, which are constrained by extremely high storage costs. As the role of market forces continuously increases with the shift from managed market agreements to a more flexible market-based environment, the market is becoming more volatile and vulnerable to unexpected extreme events [1]. Thus, proper measurement and management of market risks is increasingly valued by investors seeking to protect themselves against adverse market movements.

This paper investigates the risk measurement issue in oil markets. The measurement of risks in oil markets is a complicated process, since oil prices are jointly influenced by numerous risk factors. To name just a few, these may include economic aspects, weather changes, political aspects, military events, natural disasters, market sentiment and speculation [2, 3].

Value at Risk (VaR), as the latest development in the risk management field, is adopted in this paper to quantify and measure market risks. VaR estimates the potential loss from market risk across an entire portfolio using probability concepts, identifying the fundamental risks the portfolio contains and allowing its decomposition into quantifiable and manageable underlying risk factors.

Standard VaR estimates take the mathematical form in equation (1), which means "we are $\alpha$ percent certain that we will not lose more than VaR of our investment in the next $t$ days under normal market conditions" [4]:

\[ P\left(P_t - P_0 \le -\mathrm{VaR}\right) = 1 - \alpha \tag{1} \]

where $P_t$ denotes the value of the portfolio at time $t$ and $\alpha$ is the confidence level.
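As an illustration of the definition in equation (1), a one-day parametric VaR under a normal-returns assumption can be sketched as follows. The function and parameter names are illustrative, not from the paper, and the 2% daily volatility in the usage line is a hypothetical value:

```python
from statistics import NormalDist

def parametric_var(mu, sigma, alpha=0.99, position=1.0):
    """One-day parametric VaR in the spirit of equation (1).

    Under a normal-returns assumption, the loss on `position` will not
    exceed the returned value with probability `alpha`.
    """
    z = NormalDist().inv_cdf(alpha)       # standard normal quantile
    return position * (z * sigma - mu)    # loss reported as a positive number

# hypothetical example: 2% daily volatility on a $1 position, 99% confidence
var_99 = parametric_var(mu=0.0, sigma=0.02, alpha=0.99)
```

This static normal sketch is only the definition; the models discussed below replace the constant mean and volatility with conditional forecasts.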

Several methods for VaR estimation have been tried, falling mainly into three approaches: parametric, non-parametric and semi-parametric.

The parametric approach, also called the variance-covariance approach, is based on the assumption that returns are normally distributed. It is flexible, easy to understand and widely accepted [5], and is more popular than its more complex and sophisticated non-parametric counterpart, the simulation approach, which is implemented as either historical simulation or Monte Carlo simulation and is computationally demanding and costly.

However, the parametric approach relies heavily on the assumption of a normal return distribution, while the non-parametric approach lets the data speak for itself and extends historical patterns hidden in the data into the future. The semi-parametric approach has emerged recently to strike a balance between the two extremes: techniques borrowed from other disciplines, such as engineering, computer science and applied mathematics, have made their way into the field of finance. These include methods such as Extreme Value Theory, wavelet transformation and fuzzy logic.

Therefore, this paper proposes the wavelet denoising technique as an integral part of the VaR estimation process. The original data series is projected into the time-scale domain, wavelet coefficients below certain threshold levels are suppressed, and VaR is then estimated on the better-behaved data resulting from the denoising process.

Empirical studies based on the proposed Wavelet Denoising VaR (WDNVaR) algorithm and the more traditional ARMA-GARCH approach are conducted in both the WTI and Brent oil markets. Experiment results are backtested and compared using Kupiec backtesting procedures to evaluate their accuracy and reliability [6]. By incorporating the flexibility of the ex ante hybrid algorithm and the detailed analysis in the time-scale domain offered by wavelet analysis, WDNVaR is found to improve the reliability of VaR estimates and to offer greater flexibility than the traditional ARMA-GARCH approach.

The organization of this paper is as follows: the second section reviews evaluation methods for model-based VaR, their potential applications in the economics and finance field, and wavelet denoising techniques. The third section proposes the wavelet denoised VaR algorithm (WDNVaR). The fourth section applies the proposed WDNVaR and the traditional ARMA-GARCH approach to measure market risk exposure levels in oil markets and compares model reliability using Kupiec backtesting procedures. The fifth section concludes.

The main works in this field are those of Kupiec [7], Christoffersen [8] and Lopez [9], who proposed, respectively, statistical-based procedures and a loss function approach to test whether VaR estimates are correct and consistent with the data.

Two different approaches are currently available to evaluate VaR estimates: statistical-based procedures and loss function approaches. The Proportion of Failures test (or unconditional coverage test) and the Time Until First Failure test of Kupiec [7], together with the conditional coverage test of Christoffersen [8], belong to the first group, while the approach of Lopez [9] belongs to the second. The main difference between the two is that with statistical procedures, inference analysis is available. The tests of Kupiec and Christoffersen are based on likelihood ratios and on the assumption that VaR should exhibit a conditional or unconditional coverage equal to p.

Among the various hypothesis-based backtesting procedures available, the one proposed by Kupiec [7] is the simplest and the most popular. It is based on the simple notion that the model validation process can be treated as a series of Bernoulli trials testing sequences of successes and failures: the number of VaR exceedances N in a large sample T should converge to a binomial distribution. The likelihood ratio statistic developed by Kupiec for testing a specific confidence interval is given in equation (2).

\[ LR_{uc} = -2\ln\left[(1-p)^{T-N} p^{N}\right] + 2\ln\left[\left(1-\tfrac{N}{T}\right)^{T-N} \left(\tfrac{N}{T}\right)^{N}\right] \tag{2} \]

where $LR_{uc}$ denotes the test statistic, which has an asymptotic $\chi^2(1)$ distribution, $T$ is the total number of observations in the test set, $N$ is the number of VaR exceedances, and $p$ is the probability of a VaR exceedance occurring.
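The Kupiec statistic of equation (2) is straightforward to compute. The sketch below uses hypothetical violation counts; the 3.841 cutoff is the $\chi^2(1)$ critical value at the 5% test level:

```python
import math

def kupiec_lr(n, t, p):
    """Kupiec unconditional-coverage LR statistic, as in equation (2).

    n: number of VaR exceedances observed in the test set (0 < n < t)
    t: total number of observations in the test set
    p: expected exceedance probability (1 - VaR confidence level)
    Asymptotically chi-squared with one degree of freedom.
    """
    phat = n / t
    log_l0 = (t - n) * math.log(1 - p) + n * math.log(p)        # null model
    log_l1 = (t - n) * math.log(1 - phat) + n * math.log(phat)  # observed rate
    return -2.0 * (log_l0 - log_l1)

# hypothetical: 60 violations of a 95% VaR over 1000 days gives LR ≈ 1.98,
# below the chi-squared(1) critical value 3.841, so the model is not rejected
lr = kupiec_lr(60, 1000, 0.05)
```

When the observed violation rate equals p exactly, the statistic is zero; it grows as the model over- or under-covers.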

The application of wavelet analysis to economic and financial data is a relatively recent phenomenon. Since wavelet analysis projects data signals into the time-scale domain for analysis, it can be treated as a promising multi-scale analysis, noise reduction and multi-scale modelling tool.

Ramsey [10] gives an overview of the contribution of wavelets to the analysis of economic and financial data. The ability to represent highly complex structures without knowing the underlying functional form proved to be a great benefit for the analysis of these time-series. In addition, wavelets facilitate the precise location of discontinuities and the isolation of shocks.

Furthermore, the process of smoothing found in the time-scale decomposition facilitates the reduction of noise in the original signal, by first decomposing the signal into the wavelet components, then eliminating all values with a magnitude below a certain threshold and finally reconstructing the original signal with the inverse wavelet transform.

Stevenson [11], for example, uses wavelet analysis to filter spot electricity prices in the deregulated Australian electricity market. By examining both the demand and price series at different time locations and levels of resolution, Stevenson was able to reveal what was signal and what was noise.

Ramsey and Lampart [12] use wavelet analysis for time-scale decomposition. They researched the relationships both between consumption and income and between money and GDP. The time-scale decomposition yielded a new transformed signal built up from the several wavelet coefficients representing the several scales. At each scale, a regression was made between the two variables.

Chew [13] researches the relationship between money and income, using the same technique of wavelet-based time-scale decomposition as Ramsey and Lampart [12]. This research yielded greater insight into the money-income nexus in Germany.

Arino [14] uses wavelet-based time-scale decomposition for forecasting applications. The approach was to apply forecasting methods to each of the coefficients resulting from the time-scale decomposition. After applying forecast methods to each of these coefficients, the final forecast of the complete series was obtained by adding up the individual forecasts.

Aussem and Murtagh [15] use neural networks to examine the individual coefficients. The trained neural network with its approximated variables in the target function was used for the final forecast. In the area of finance, multi-resolution analysis appears useful, as different traders view the market with different time resolutions, for example hourly, daily, weekly or monthly. The shorter the time-period, the higher the frequency. Different types of traders create the multi-scale dynamics of time-series.

Struzik [16] applies the wavelet-based effective Hölder exponent to examine the correlation level of the Standard & Poor's index locally at arbitrary positions and resolutions (time and scale).

Norsworty et al. [17] apply wavelets to analyze the relationship between the return on an asset and the return on the market portfolio, or investment alternative. Similar to other research in the field of finance and economics, they applied wavelet-based time-scale decomposition to investigate whether there are changes in behavior at different frequencies. The research indicated that the effect of the market return on an individual asset's return is greater at higher frequencies than at lower ones.

Denoising, or noise reduction, is a perennial topic for engineers and applied scientists. The denoising problem is quite varied due to the variety of signals and noises. This article considers deterministic signals in zero-mean white noise, as $y(t)$ in (3):

\[ y(t) = s(t) + \varepsilon(t) \tag{3} \]

where $s(t)$ is the signal to be estimated and $\varepsilon(t)$ is a zero-mean white noise with variance $\sigma^2$. To be exact, the problem is to estimate $s(t)$, i.e. to denoise $y(t)$.

It is always tempting to reduce noise after some kind of signal transformation. An appropriate transform can project the signal into a domain where the signal energy is concentrated in a small number of coefficients. If the noise, on the other hand, is evenly distributed across this domain, then this domain is a very convenient place to perform denoising, since the Signal-to-Noise Ratio (SNR) is greatly increased in some important coefficients; in other words, the signal is highlighted in this domain while the noise is not.

In this sense, for signals composed of a number of sinusoids, it is wise to denoise in the frequency domain. Similarly, for piecewise constant or piecewise polynomial signals, it is advantageous to reduce noise in the wavelet transform domain, or time-scale domain, where these signals have a very sparse representation. Since a wide range of signals can be classified as piecewise polynomial, the wavelet transform has become an essential tool for many applications, especially image processing.

Thresholding noisy information using wavelet denoising techniques requires the selection of a thresholding rule and of a threshold level: the thresholding rule governs how the model reacts to the noisy signals spotted, while the threshold selection rule determines what type of signal is recognised as noise.

Let $W(\cdot)$ and $W^{-1}(\cdot)$ denote the forward and inverse wavelet transform operators, and let $D(\cdot,\lambda)$ denote the thresholding operator with threshold $\lambda$. The practice of thresholding denoising consists of the following three steps:

\[ w = W(y) \tag{4} \]

\[ \hat{w} = D(w, \lambda) \tag{5} \]

\[ \hat{s} = W^{-1}(\hat{w}) \tag{6} \]

Hard thresholding and soft thresholding differ only in the thresholding step (equation (5)). The hard-thresholding function keeps all wavelet coefficients that are greater in magnitude than the given threshold and sets the others to zero:

\[ D(w, \lambda) = \begin{cases} w, & |w| > \lambda \\ 0, & |w| \le \lambda \end{cases} \tag{7} \]

The threshold $\lambda$ is chosen according to the signal energy and the noise variance $\sigma^2$. The soft-thresholding function follows a somewhat different rule from the hard-thresholding function: it shrinks the wavelet coefficients towards zero by $\lambda$.

\[ D(w, \lambda) = \operatorname{sgn}(w)\,\max\left(|w| - \lambda,\; 0\right) \tag{8} \]
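The two thresholding rules of equations (7) and (8) can be sketched in a few lines of NumPy; the sample coefficient vector is hypothetical:

```python
import numpy as np

def hard_threshold(w, lam):
    """Equation (7): keep coefficients whose magnitude exceeds lam, zero the rest."""
    return np.where(np.abs(w) > lam, w, 0.0)

def soft_threshold(w, lam):
    """Equation (8): shrink every coefficient towards zero by lam."""
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

w = np.array([-3.0, -0.5, 0.2, 1.5])
hard = hard_threshold(w, 1.0)   # [-3.0, 0.0, 0.0, 1.5]
soft = soft_threshold(w, 1.0)   # [-2.0, 0.0, 0.0, 0.5]
```

Note the design trade-off visible in the example: hard thresholding preserves large coefficients exactly, while soft thresholding biases them towards zero but yields a smoother estimate.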

The ultimate goal of thresholding is to set the threshold at the volatility level of the underlying noisy signals. In that ideal situation, noisy signals are completely wiped out without losing useful information. However, since the true value of future volatility may never be known for sure, determining the appropriate threshold value is a very tricky task. Various threshold selection rules have emerged over the years, including universal thresholding, minimax estimation and Stein's Unbiased Risk Estimate (SURE).

The universal threshold sets the threshold level as in (9):

\[ \lambda = \sigma \sqrt{2 \ln N} \tag{9} \]

where $\lambda$ is the threshold, $N$ is the number of observations and $\sigma$ is the volatility estimator. As $N$ approaches infinity (i.e. an infinitely large sample size), the universal threshold guarantees the elimination of all noisy signals. But it may also suppress the true signal when the threshold value is set at an overly high level [18].
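Equation (9) can be sketched as follows. The robust median-absolute-deviation estimate of $\sigma$ used here is the common Donoho-Johnstone convention, an assumption on our part since the text does not fix the volatility estimator:

```python
import numpy as np

def universal_threshold(detail):
    """Universal threshold of equation (9): lambda = sigma * sqrt(2 ln N).

    sigma is estimated with the usual robust MAD rule, typically applied
    to the finest-scale wavelet detail coefficients.
    """
    sigma_hat = np.median(np.abs(detail)) / 0.6745  # MAD noise estimate
    return sigma_hat * np.sqrt(2.0 * np.log(detail.size))
```

Because $\lambda$ grows with $\sqrt{\ln N}$, large samples push the threshold up, which is exactly the over-suppression risk noted above.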

Therefore, although universally thresholded signals are visually appealing, the method is not particularly popular due to its low goodness of fit [19].

Minimax estimation was developed as a step beyond the universal threshold. It shifts the focus from suppressing the maximum possible noise to achieving the best function fit: minimax estimation attempts to attain the minimax value, minimising the overall mean square error while retaining the maximum possible level of information in the signal. Therefore, abrupt changes and spikes that are usually smoothed or distorted by the universal threshold method are retained by the minimax approach [19, 20, 21].

Estimating a signal that is corrupted by additive noise has been of interest to many researchers for practical as well as theoretical reasons. The problem is to recover the original signal from the noisy data. The basic principle of wavelet denoising is to identify and zero out wavelet coefficients of the signal which are likely to contain mostly noise. By identifying and preserving significant coefficients, wavelet thresholding preserves important high pass features of the signal such as discontinuities. This property is useful, for example, in image denoising to maintain the sharpness of edges in the image.

Therefore, it is important for the data to be pre-processed before being modelled using conventional VaR estimation approaches.

While the Fourier transform could also be argued to be the algorithm of choice for the de-noising process, it is only applicable when the composition of the original process can be well approximated by sinusoidal functions. This requires that the underlying processes be both periodic and globally stationary. For financial time series, however, global stationarity and periodicity are very strong assumptions that rarely hold in practice [18]. Therefore, wavelet analysis is proposed as a more appropriate alternative for capturing non-stationarity and non-periodicity in the time-frequency domain.

Given the wavelet's advantage in locking in details in the time-frequency domain, it is an extremely useful tool for analyzing and adapting to unknown data characteristics. Typically for financial data, the noise level over time and the variance corresponding to each time horizon are unknown. By utilizing the power of wavelet analysis, these data characteristics can be analyzed and manipulated to improve VaR estimation accuracy. After the de-noising process, disturbing noisy signals (such as common white noise) are filtered out.

Also, the signals with the strongest energy concentration can be singled out, while signals that are too weak or carry the least useful information are simply ignored. The signal after the de-noising process is therefore expected to comply with investors' interests; it is also more stationary and more suitable for model fitting. Thus, wavelets are expected to help filter and clean noisy signals and data outliers. This prepares better-behaved data, and the model fitting process is less affected by trivial details and outliers [18, 22].

The basic procedure for performing wavelet de-noising is as follows: first, transform the original signal using wavelet multiresolution analysis; then, for the wavelet coefficients at the different scales, set a threshold value and set to zero all wavelet coefficients whose magnitude is below it [19, 21, 22]. WDNVaR is then estimated by applying conventional approaches to the denoised data set.

In more detail, the WDNVaR algorithm proceeds as follows:

1. By applying the wavelet transformation to the return series, the original data are decomposed into sub-series at different scales up to level J, as in (10):

\[ r_t = a_{J,t} + \sum_{i=1}^{J} d_{i,t} \tag{10} \]

where $a_{J,t}$ is the decomposed return series obtained by applying the scaling function at scale $J$, and $d_{i,t}$ is the decomposed return series obtained by applying the wavelet function at scale $i$.

2. The wavelet coefficients are denoised according to:

• the chosen threshold setting (e.g. the universal threshold);

• the chosen removal strategy (hard or soft thresholding).

3. Denoised data are reconstructed from denoised wavelet coefficients.

• The conditional mean is aggregated from the individual forecasts for both the denoised data and the noise through an ARMA model:

\[ \hat{\mu}_t = \hat{\mu}_t^{\,denoised} + \hat{\mu}_t^{\,noise} \tag{11} \]

• The conditional volatility is aggregated from the individual forecasts for both the denoised data and the noise through a GARCH model:

\[ \hat{\sigma}_t^2 = \left(\hat{\sigma}_t^{\,denoised}\right)^2 + \left(\hat{\sigma}_t^{\,noise}\right)^2 \tag{12} \]

This aggregation rests on the energy preservation property of the wavelet transform.

4. VaR based on the wavelet denoised approach then follows the standard parametric form:

\[ \mathrm{WDNVaR}_t = -\left(\hat{\mu}_t + z_{1-\alpha}\,\hat{\sigma}_t\right) \tag{13} \]

where $z_{1-\alpha}$ is the $(1-\alpha)$-quantile of the assumed return distribution.
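The four steps above can be sketched end-to-end. This is a deliberately simplified sketch: it uses a one-level Haar transform with hard thresholding and a static Gaussian mean/volatility estimate in place of the rolling ARMA-GARCH forecasts of equations (11)-(13); all function names are illustrative, and an even-length return series is assumed:

```python
import numpy as np
from statistics import NormalDist

def haar_dwt(x):
    """One-level Haar transform (even-length input assumed)."""
    even, odd = x[0::2], x[1::2]
    return (even + odd) / np.sqrt(2.0), (even - odd) / np.sqrt(2.0)

def haar_idwt(approx, detail):
    """Inverse one-level Haar transform."""
    x = np.empty(2 * approx.size)
    x[0::2] = (approx + detail) / np.sqrt(2.0)
    x[1::2] = (approx - detail) / np.sqrt(2.0)
    return x

def wdn_var(returns, alpha=0.99):
    """Steps 1-4 of the WDNVaR procedure, simplified."""
    a, d = haar_dwt(np.asarray(returns))                  # step 1: decompose
    lam = (np.median(np.abs(d)) / 0.6745) * np.sqrt(2.0 * np.log(d.size))
    d = np.where(np.abs(d) > lam, d, 0.0)                 # step 2: hard rule
    denoised = haar_idwt(a, d)                            # step 3: reconstruct
    z = NormalDist().inv_cdf(alpha)                       # step 4: parametric VaR
    return z * denoised.std() - denoised.mean()
```

In the full procedure, the denoised series and the removed noise would each be fitted with ARMA-GARCH and their forecasts aggregated as in equations (11) and (12) before computing equation (13).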

In this section we present the data set, the descriptive statistics, the forecast performance results and the interpretation of the experimental results.

Crude oil is one of the most important industrial inputs and remains a major source of the world's energy consumption. The price path of crude oil and its volatility affect different market movements and the economic status as a whole [1, 23].

West Texas Intermediate (WTI) and Brent (or Brent-Forties-Oseberg) crude oil spot prices are used in this study. These markets are considered the world marker crude oil markets. Most other crude oil prices are related to them.

The period covered is from 20th May, 1988 to 29th December, 2006 for Brent crude oil and from 27th March, 1987 to 30th December, 2005 for WTI crude oil. 60% of each data set serves as the training set, while the remaining 40% is used as the test set. One-step-ahead out-of-sample forecasts are conducted to evaluate the accuracy and reliability of the various models under investigation.

During implementation, the training set is continuously expanded to include the newly available observation at each iteration, so that the arrival of new information is taken into consideration. A portfolio holding a single asset position worth $1 is assumed for each market. The original observations are log-differenced (i.e. $r_t = \ln P_t - \ln P_{t-1}$) for further processing and modelling. Figure 1 displays the return series for WTI and European Brent oil.
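The data preparation described above (log-differencing, the 60/40 split, and the expanding training window) can be sketched as follows; the helper names and the tiny price series are illustrative:

```python
import numpy as np

def log_returns(prices):
    """Log-difference the price series: r_t = ln(P_t) - ln(P_(t-1))."""
    return np.diff(np.log(np.asarray(prices, dtype=float)))

def train_test_split(series, train_frac=0.6):
    """60% training / 40% test split, as in the experiment design."""
    cut = int(len(series) * train_frac)
    return series[:cut], series[cut:]

def expanding_windows(series, start):
    """Yield ever-growing training windows with the next observation,
    mimicking the iterative re-estimation scheme."""
    for end in range(start, len(series)):
        yield series[:end], series[end]
```

Each yielded pair is a training sample plus the single out-of-sample observation against which the one-step-ahead VaR forecast is evaluated.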

The oil markets are characterized by high volatility and the leptokurtic phenomenon (i.e. fat tail and high kurtosis, which signals high probability of extreme events occurrences), which makes adequate risk management and control necessary. This is confirmed by several stylized facts concluded from Table 1:

These facts suggest a highly competitive and volatile market. Firstly, there are significant price fluctuations in the markets, as suggested by the positive standard deviations. The substantial difference between the minimum and maximum levels also indicates considerable losses if risks are not properly measured and managed.

Secondly, we can remark that there is a higher probability of losses in both markets, as indicated by the negative skewness.

Thirdly, the high level of excess kurtosis suggests that the markets are volatile, with high probability of extreme events occurrences.

The nonlinear and volatile nature of the oil markets is further confirmed by the formal statistical tests conducted. The rejection of the Jarque-Bera test of normality suggests that the returns deviate significantly from the normal distribution and exhibit leptokurtic behavior. The rejection of the BDS (Brock-Dechert-Scheinkman) test of independence indicates the existence of non-linearity within the data.

For the hypothesis testing approach, the null hypothesis is that the VaR model exhibits the statistical properties characteristic of accurate VaR estimates. The test statistic is calculated and compared to critical values corresponding to a certain confidence level to decide whether or not to reject the model at that level.

In this paper, the traditional hybrid algorithm based on the ARMA-GARCH model serves as the benchmark model [24]. The usual ARMA(r, m)-GARCH(p, q) takes the form in (14):

• ARMA(r, m):

\[ r_t = \phi_0 + \sum_{i=1}^{r}\phi_i\, r_{t-i} + \varepsilon_t + \sum_{j=1}^{m}\theta_j\, \varepsilon_{t-j} \tag{14} \]

where $\phi_i$ is the autoregressive coefficient, $\theta_j$ is the moving average coefficient, and $\varepsilon_t$ is the white noise.

• GARCH(1, 1):

\[ \sigma_t^2 = \omega + \alpha\,\varepsilon_{t-1}^2 + \beta\,\sigma_{t-1}^2 \]

where $\sigma_t^2$ is the conditional variance at time $t$, $\sigma_{t-1}^2$ is the lag-1 variance, and $\varepsilon_{t-1}^2$ is the lag-1 squared residual from the previous period.

Among the various parametric models available, the hybrid ARMA model with GARCH error correction is widely accepted in both academia and industry, since it is easy to understand and to implement in practice. But its performance is constrained by its sequential linear filtering process, as described before.

The GARCH(1, 1) model is used in the experiment, since empirical research suggests that it suffices in most situations. The training set is used to estimate the parameters by maximising the likelihood function.
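Given already-estimated parameters, the one-step-ahead conditional variance follows directly from the GARCH(1,1) recursion above. A minimal sketch, assuming the parameters have been fitted by maximum likelihood on the training set and initialising at the sample variance (a common convention, not specified by the paper):

```python
import numpy as np

def garch11_variance(residuals, omega, alpha, beta):
    """One-step-ahead conditional variance from the GARCH(1,1) recursion
    sigma_t^2 = omega + alpha * eps_(t-1)^2 + beta * sigma_(t-1)^2.
    """
    sigma2 = float(np.var(residuals))  # initialise at the sample variance
    for eps in residuals:
        sigma2 = omega + alpha * eps ** 2 + beta * sigma2
    return sigma2
```

The square root of the returned variance is the volatility forecast that feeds the parametric VaR quantile.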

As suggested by the experiment results in Table 2, ARMA(1,1)-GARCH(1,1) performs rather well. It fails only at the 95% confidence level in the WTI oil market and is accepted under all other circumstances.

The performance of ARMA(1,1)-GARCH(1,1) gradually deteriorates at higher confidence levels for all markets, while it provides much better coverage of risks at lower confidence levels. This implies that the ARMA(1,1)-GARCH(1,1) model may underestimate risk and serve as a generally aggressive risk measure.

The high level of acceptance of ARMA-GARCH VaR supports and confirms the popularity of the linear combining power of the ARMA and GARCH models in the estimation process. However, increasing competition in the markets pushes operators to work on slight margins, implying that additional accuracy and flexibility have to be pursued.

Several parameters for the wavelet de-noising process are chosen according to the investor's preferences. In this experiment, the hard thresholding rule with minimax thresholding is adopted. The wavelet decomposition level is set at level 2, and Daubechies 2 is chosen as the wavelet family.

The return series in the training set is decomposed using the selected wavelet family. Wavelet coefficients at the different scales are de-noised according to the chosen thresholding rule and threshold selection rule. The return series is then reconstructed from the de-noised wavelet coefficients. Future daily volatilities are forecast using the estimated ARMA-GARCH model based on the rolling-window method, and daily VaR values are forecast correspondingly.

Experimental results in Table 3 confirm the significant performance improvement gained. WDNVaR is accepted across all markets at the 95% confidence level. During the WDNVaR estimation process, the ARMA-GARCH model seems to capture more of the data features when applied to the wavelet denoised data set. WDNVaR is also less conservative than the traditional ARMA-GARCH VaR.

However, WDNVaR is rejected at higher confidence levels in the Brent market. Since the performance improvement of WDNVaR rests on the wavelet denoising process, the choice of wavelet family and thresholding rule may influence its performance.

In this experiment, the soft thresholding rule with the universal threshold is adopted. The wavelet decomposition level is set at level 3, and Daubechies 2 is chosen as the wavelet family.

The performance improvement brought by WDNVaR is shown in Table 4. The results confirm higher reliability, which shows that the heterogeneous market structure is taken into account by the wavelet analysis: data and noise are separated, and risk evolution is tracked more closely in a multi-scale (time-scale) domain.

Oil markets are becoming more volatile and risky, with newly emerging characteristics; novel risk management techniques are therefore desired. This paper introduces Wavelet Denoising VaR analysis as a promising data smoothing tool for risk measurement and management.

The performance of the proposed WDNVaR and the traditional ARMA-GARCH VaR has been evaluated by Kupiec backtesting procedures.

The unique contributions WDNVaR could offer are twofold: the multi-scale denoising process increases the goodness of fit in further modelling attempts, and risks are embedded in both the data and the noise, and thus deserve separate modelling.

[1] Plourde, A. & Watkins, G.C. (1998). Crude Oil Prices Between 1985 and 1994: How Volatile in Relation to Other Commodities?, Resources and Energy Economics, 20, pp. 245-262.

[2] Alvarez-Ramirez, J., Soriano, A., Cisneros, M. & Suarez, R. (2003). Symmetry/anti-symmetry phase transitions in crude oil markets, Physica A, 322, pp. 583-596.

[3] Wong, S.Y., Yu, L.A. & Lai, K.K. (2005). A Rough-Set-Refined Text Mining Approach for Crude Oil Market Tendency Forecasting, Int'l J. Knowledge and Systems Sciences, 2, pp. 33-46.

[4] Jorion, P. (2000). Value at Risk: The New Benchmark for Managing Financial Market Risk. McGraw-Hill Trade, N.Y.

[5] Wiener (1997). Introduction to value at risk, Risk Management and Regulation in Banking, Jerusalem, Israel.

[6] He, K.J., Xie, C., Chen, S. & Lai, K.K. (2006). Market Risk for Non Ferrous Metals: A Wavelet Based VaR Approach. Intelligent Systems Design and Applications, ISDA '06, Eighth International Conference, IEEE, 1, pp. 1179-1184.

[7] Kupiec, P. (1995). Techniques for verifying the accuracy of risk measurement models. J. Derivatives, 3, pp. 73-84.

[8] Christoffersen, P.F. (1998). Evaluating interval forecasts. International Economic Review, 39, pp. 841-862.

[9] Lopez, J.A. (1999). Methods for evaluating value-at-risk estimates. Federal Reserve Bank of San Francisco Review, 2, pp. 3-15.

[10] Ramsey, J. (1999). The contribution of wavelets to the analysis of economic and financial data. Philosophical Transactions of the Royal Society of London, 357.

[11] Stevenson, M. (2000). Filtering and forecasting spot electricity prices in the increasingly deregulated Australian electricity market. Technical report, School of Finance and Economics, University of Technology Sydney, Sydney.

[12] Ramsey, J. & Lampart, C. (1998). Decomposition of economic relationships by time scale using wavelets: Money and income. Macroeconomic Dynamics, 2, pp. 49-71.

[13] Chew, C. (2001). The money and income relationship of European countries by time scale decomposition using wavelets. Preliminary paper, New York University.

[14] Arino, A. (1996). Forecasting time series via the discrete wavelet transform. Computing in Economics and Finance.

[15] Aussem, A. & Murtagh, F. (1997). Combining neural network forecasts on wavelet-transformed time series. Connection Science, 9, pp. 113-121.

[16] Struzik, Z. (2001). Wavelet methods in (financial) time-series. Physica A, 296, pp. 307-319.

[17] Norsworty, J., Li, D. & Gorener, R. (2000). Wavelet-based analysis of time series: an export from engineering to finance. IEEE International Engineering Management Society Conference, Albuquerque, New Mexico.

[18] Gencay, R., Selcuk, F. & Whitcher, B. (2003). An Introduction to Wavelets and Other Filtering Methods in Finance and Economics. Academic Press, San Diego.

[19] Donoho, D.L. & Johnstone, I.M. (1998). Minimax estimation via wavelet shrinkage. Annals of Statistics, 26, pp. 879-921.

[20] Donoho, D.L. & Johnstone, I.M. (1994). Ideal spatial adaptation by wavelet shrinkage. Biometrika, 81, pp. 425-455.

[21] Donoho, D.L. & Johnstone, I.M. (1995). Adapting to unknown smoothness by wavelet shrinkage. J. the American Statistical Association, 90, pp. 1200-1224.

[22] Ramsey, J. (2002). Wavelets in economics and finance: past and future. Studies in Non-linear Dynamics & Econometrics, 6(3), pp. 1-27.

[23] Yang, C.Y., Hwang, M.J. & Huang, B.N. (2002). An Analysis of Factors Affecting Price Volatility of the US Oil Market. Energy Economics, 24, pp. 107-119.

[24] Sadorsky, P. (2002). Time-varying Risk Premiums in Petroleum Futures Prices. Energy Economics, 24, pp. 539-556.

Published with license by Science and Education Publishing, Copyright © 2018 Samia Mederessi and Slaheddine Hallara

This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit https://creativecommons.org/licenses/by/4.0/

Samia Mederessi, Slaheddine Hallara. Review on Wavelet Denoised Value at Risk and Application on Crude Oil Market. *International Journal of Global Energy Markets and Finance*. Vol. 1, No. 1, 2018, pp 4-10. https://pubs.sciepub.com/ijgefm/1/1/2


[1] Plourde, A. & Watkins, G.C. (1998). Crude Oil Prices Between 1985 and 1994: How Volatile in Relation to Other Commodities?, Resources and Energy Economics, 20, pp. 245-262.

[2] Alvarez-Ramirez, J., Soriano, A., Cisneros, M. & Suarez, R. (2003). Symmetry/anti-symmetry phase transitions in crude oil markets, Physica A, 322, pp. 583-596.

[3] Wong, S.Y., Yu, L.A. & Lai, K.K. (2005). A Rough-Set-Refined Text Mining Approach for Crude Oil Market Tendency Forecasting, Int'l J. Knowledge and Systems Sciences, 2, pp. 33-46.

[4] Jorion, P. (2000). Value at Risk: The New Benchmark for Managing Financial Market Risk. McGraw-Hill Trade, New York.

[5] Wiener (1997). Introduction to value at risk, Risk Management and Regulation in Banking, Jerusalem, Israel.

[6] He, K.J., Xie, C., Chen, S. & Lai, K.K. (2006). Market Risk for Non-Ferrous Metals: A Wavelet-Based VaR Approach. Proceedings of the International Conference on Intelligent Systems Design and Applications (ISDA '06), IEEE, 1, pp. 1179-1184.

[7] Kupiec, P. (1995). Techniques for verifying the accuracy of risk measurement models. J. Derivatives, 3, pp. 73-84.

[8] Christoffersen, P.F. (1998). Evaluating interval forecasts. International Economic Review, 39, pp. 841-862.

[9] Lopez, J.A. (1999). Methods for evaluating value-at-risk estimates. Federal Reserve Bank of San Francisco Review, 2, pp. 3-15.

[10] Ramsey, J. (1999). The contribution of wavelets to the analysis of economic and financial data. Philosophical Transactions of the Royal Society of London, 357.

[11] Stevenson, M. (2000). Filtering and forecasting spot electricity prices in the increasingly deregulated Australian electricity market. Technical report, School of Finance and Economics, University of Technology Sydney, Sydney.

[12] Ramsey, J. & Lampart, C. (1998). Decomposition of economic relationships by time scale using wavelets: Money and income. Macroeconomic Dynamics, 2, pp. 49-71.

[13] Chew, C. (2001). The money and income relationship of European countries by time scale decomposition using wavelets. Preliminary paper, New York University.

[14] Arino, A. (1996). Forecasting time series via the discrete wavelet transform. Computing in Economics and Finance.

[15] Aussem, A. & Murtagh, F. (1997). Combining neural network forecasts on wavelet-transformed time series. Connection Science, 9, pp. 113-121.

[16] Struzik, Z. (2001). Wavelet methods in (financial) time-series. Physica A, 296, pp. 307-319.

[17] Norsworthy, J., Li, D. & Gorener, R. (2000). Wavelet-based analysis of time series: an export from engineering to finance. IEEE International Engineering Management Society Conference, Albuquerque, New Mexico.

[18] Gencay, R., Selcuk, F. & Whitcher, B. (2003). An Introduction to Wavelets and Other Filtering Methods in Finance and Economics. Academic Press, San Diego.

[19] Donoho, D.L. & Johnstone, I.M. (1998). Minimax estimation via wavelet shrinkage. Annals of Statistics, 26, pp. 879-921.

[20] Donoho, D.L. & Johnstone, I.M. (1994). Ideal spatial adaptation by wavelet shrinkage. Biometrika, 81, pp. 425-455.

[21] Donoho, D.L. & Johnstone, I.M. (1995). Adapting to unknown smoothness via wavelet shrinkage. J. of the American Statistical Association, 90, pp. 1200-1224.

[22] Ramsey, J. (2002). Wavelets in economics and finance: past and future. Studies in Nonlinear Dynamics & Econometrics, 6(3), pp. 1-27.

[23] Yang, C.Y., Hwang, M.J. & Huang, B.N. (2002). An Analysis of Factors Affecting Price Volatility of the US Oil Market. Energy Economics, 24, pp. 107-119.

[24] Sadorsky, P. (2002). Time-varying Risk Premiums in Petroleum Futures Prices. Energy Economics, 24, pp. 539-556.