Research Article
Open Access, Peer-reviewed

A Wavelet-ARMA-GARCH Refinement Method to VaR Estimate for Foreign Exchange Market

Samia Mederessi, Slaheddine Hallara
International Journal of Business and Risk Management. 2018, 1(1), 28-36. DOI: 10.12691/ijbrm-1-1-4
Published online: April 23, 2018

Abstract

With the increasing internationalization of financial transactions, the foreign exchange market has been profoundly transformed and has become more competitive and volatile. This places the accurate and reliable measurement of market risks in a crucial position for both investment decisions and the design of hedging strategies. This paper deals with the measurement of risk from a VaR perspective. A Wavelet-ARMA-GARCH refinement method for VaR estimation is proposed and compared with the classical ARMA-GARCH approach. The performance of both approaches is tested and compared using the Kupiec backtesting procedure. Experimental results suggest that the Wavelet-ARMA-GARCH refinement method improves the reliability of VaR estimates at all confidence levels, which offers considerable flexibility and potential performance improvement for foreign exchange dealers. Furthermore, the appropriate selection and combination of parameters can lead to comprehensive improvement in reliability.

1. Introduction

The foreign exchange market plays an indispensable role in providing the essential machinery for making payments across borders, transferring funds and purchasing power from one currency to another, and determining that singularly important price, the exchange rate. Since the early 1970s, with the increasing internationalization of financial transactions, the foreign exchange market has been profoundly transformed, not only in size, but in coverage, architecture, and mode of operation. This transformation is the result of structural shifts in the economy and in international financial systems.

The foreign exchange business is inherently risky, because it deals primarily in risk: measuring it, pricing it, accepting it when appropriate, and managing it. The success of a bank or other institution trading in the foreign exchange market depends critically on how well it assesses, prices, and manages risk, and on its ability to limit losses from particular transactions and to keep its overall exposure controlled.

Market risk is simply price risk, or exposure to price change. Various mechanisms are used to control it, and each institution has its own system. At the most basic trading room level, banks have long maintained clearly established volume or position limits on the maximum open position that each trader or group can carry overnight, with separate (and probably less restrictive) intraday or daylight limits on the maximum open position that can be taken during the course of a trading session.

Market participants need a more dynamic, time-evolving way of assessing market risk, rather than measuring risk on the basis of a snapshot at one particular moment or by looking at the estimated amounts of funds involved. Industry members have recommended a series of actions to assist in the measurement of market risk.

They recommended that institutions adopt a value at risk (VaR) measure of market risk, a technique that can be applied to foreign exchange and to other products. It is used to assess both the market risk of the trading room's foreign exchange position and the broader market risk inherent in the foreign exchange exposure arising from the totality of the bank's or firm's activities. VaR is a statistical number describing the potential downside risk over a given holding period at a certain confidence level. Numerous techniques have evolved to extract information from data and to estimate accurate and reliable VaR numbers.

In this paper we focus on wavelet analysis. Although this method is gradually gaining momentum in financial time series forecasting, it has received little attention in the risk management field. There have been some attempts to apply wavelet analysis to VaR estimates, but their focus is on investigating the distribution of potential market losses embedded in VaR numbers across time horizons. Their approach is based on the assumption that wavelet-decomposed variances at different scales represent investors' preferences. However, they seem to have ignored the impact of the wavelet family chosen for analysis, which leaves their findings largely inconclusive. More comprehensive research is needed to investigate what wavelet analysis can achieve for VaR estimation and analysis.

Thus, in this paper we introduce this approach to VaR estimation for the foreign exchange market. Experiments using daily time series of CAD/USD, JPY/USD, SZF/USD and SFR/USD exchange rate returns are conducted to statistically evaluate the performance of the wavelet approach and the more standard ARMA-GARCH approach to VaR estimation. Although experimentation is an important step in research, concluding statements can only be made after a thorough process of validation. Moreover, three families of wavelets and two decomposition levels are used to investigate the effect of the wavelet family and the decomposition level on the model's performance.

Experimental results are backtested and compared using the Kupiec backtesting procedure to evaluate their accuracy and reliability.

The rest of the paper is set out as follows. Section 2 reviews Value at Risk and the Kupiec backtesting procedure. Section 3 introduces wavelet analysis and its applications in economics and finance. Section 4 proposes the Wavelet-ARMA-GARCH refinement method, or Wavelet Decomposed VaR (WDVaR), as a specific application of wavelet analysis to VaR estimation. Section 5 presents the empirical analysis: we estimate foreign exchange rate VaR using the standard ARMA-GARCH scheme and the proposed wavelet methodology, with three wavelet families (Db4, Haar, and Sym6), and the models are then evaluated and compared. Section 6 concludes.

2. Literature Review

2.1. Value at Risk (VaR)

VaR estimates the potential loss from market risk across an entire portfolio using probability concepts, decomposing the portfolio into underlying risk factors that are quantifiable and manageable.

Standard VaR estimates take the mathematical form in (1), which can be read as "we are X percent certain that we will not lose more than the VaR amount of our investment in the next t days under normal market conditions" [1]:

$P\left(\Delta P_t \le -\mathrm{VaR}\right) = 1 - c$   (1)

where $P_t$ denotes the value of the portfolio at time $t$, $\Delta P_t$ its change over the holding period, and $c$ is the confidence level.

Methods for VaR estimation fall mainly into the following three approaches:

The parametric approach, also called the variance-covariance approach, is more popular than its more complex and sophisticated non-parametric counterpart, the simulation approach, which, implemented as either historical simulation or Monte Carlo simulation, is computationally demanding and costly. The parametric approach rests on the assumption that returns are normally distributed. It is flexible, easy to understand, and widely accepted [2]. However, it relies heavily on the normality assumption, which can be wrong when the distribution is "fat-tailed": exceptions then occur more frequently than a normal distribution would predict [3].
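To make the variance-covariance approach concrete, the following is a minimal Python sketch assuming normally distributed returns; the simulated series and the 99% confidence level are illustrative, not the paper's data.

```python
import numpy as np
from scipy.stats import norm

def parametric_var(returns, confidence=0.99):
    """One-day variance-covariance VaR under the normality assumption."""
    mu = returns.mean()
    sigma = returns.std(ddof=1)
    z = norm.ppf(1.0 - confidence)  # lower-tail quantile, about -2.326 at 99%
    return -(mu + z * sigma)        # reported as a positive loss figure

# Illustrative usage on simulated daily returns
rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.01, size=1000)
print(parametric_var(returns))  # roughly 0.023 for 1% daily volatility
```

If the true distribution is fat-tailed, the normal quantile understates the tail losses, which is precisely the weakness noted above.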

The non-parametric approach lets the data speak for themselves and extends historical patterns hidden in the data into the future. Semi-parametric approaches have emerged recently to strike a balance between the two extremes, borrowing techniques from other disciplines such as engineering, computer science, and applied mathematics; these include Extreme Value Theory, wavelet transformation, and fuzzy logic.

2.2. VaR Models Validation: The Kupiec Backtesting Procedure

VaR models are useful only if they predict risk well. Model validation is the process of checking whether a model performs adequately, and it can be done with a set of tools. One of these tools is backtesting: a statistical framework that verifies whether projected losses are in line with actual losses [4]. This entails systematically comparing the history of VaR forecasts with the corresponding portfolio returns. For VaR users and risk managers, these checks are essential to examine whether their model is well calibrated. If not, the model must be reexamined in terms of its parameters, assumptions, and modelling choices. The theory and the corresponding models for backtesting are derived from Jorion [4] and Hull [3].

An observation is a moment at which the actual return over a horizon of h days is compared with the forecasted VaR number for that same horizon.

The number of observations exceeding the VaR is also known as the number of exceptions. With too many exceptions, the model underestimates the risk. With too few exceptions, the model is in fact conservative, leading to an inefficient allocation of capital.

As already mentioned, backtesting involves systematically comparing the history of VaR forecasts with the actual, subsequent returns. With a perfectly calibrated model, the number of exceptions should be in line with the confidence level: with a 95% confidence level, one expects 5% exceptions. In practice, however, this is not always the case: a greater percentage (6%, 7%, or 8%) can be found as a result of bad luck, and a smaller percentage can of course be found as well. At some point, a decision must be made to accept or reject the current model, and it can be based on the results of a statistical test.

One way to verify the accuracy of the model is to examine its failure rate: the proportion of times the VaR figure is exceeded in a given sample. Given a VaR specified at a certain confidence level c over a total of T observations, N is defined as the number of exceptions, i.e. the number of observations where the actual loss exceeds the VaR, and N/T is the failure rate [4].

Among the various hypothesis-based backtesting procedures available, the one proposed by Kupiec in 1995 is the simplest and the most popular. It is based on the notion that the model validation process can be treated as a series of Bernoulli trials recording successes and failures: the number of VaR exceedances N in a large sample of T observations converges to a binomial distribution. Kupiec's likelihood ratio statistic for testing a specific confidence interval is given in (2):

$LR_{uc} = -2\ln\left[(1-p)^{T-N} p^{N}\right] + 2\ln\left[\left(1-\tfrac{N}{T}\right)^{T-N} \left(\tfrac{N}{T}\right)^{N}\right]$   (2)

where $LR_{uc}$ denotes the test statistic, which has an asymptotic $\chi^2(1)$ distribution, $T$ is the total number of observations in the test set, and $p$ is the probability of a VaR exceedance.
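A minimal Python sketch of the Kupiec test in (2) follows; the function name is illustrative, and the implementation assumes 0 < N < T so both likelihood terms are well defined.

```python
import numpy as np
from scipy.stats import chi2

def kupiec_pof(N, T, p):
    """Kupiec proportion-of-failures LR test; asymptotically chi-square(1).

    N: number of VaR exceedances observed in the test set
    T: total number of observations in the test set
    p: expected exceedance probability (1 - confidence level)
    Assumes 0 < N < T so both log terms are defined.
    """
    log_null = (T - N) * np.log(1 - p) + N * np.log(p)          # exceedance prob = p
    log_obs = (T - N) * np.log(1 - N / T) + N * np.log(N / T)   # observed rate N/T
    lr = -2.0 * (log_null - log_obs)
    p_value = 1.0 - chi2.cdf(lr, df=1)
    return lr, p_value

# Example: 18 exceptions over 260 test days at the 95% level
print(kupiec_pof(18, 260, 0.05))
```

A model is rejected when the statistic exceeds the chi-square critical value, i.e. when the observed failure rate N/T is too far from p in either direction.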

3. Wavelet Analysis and Its Application in Economics and Finance

3.1. Wavelet Analysis Development

Wavelet analysis is a relatively new tool in the field of applied mathematics. Daubechies 5, Chui 6 and Graps 7 provide the fundamentals of the wavelet theory. Wavelet analysis provides the opportunity to make semi-parametric estimations of highly complex structures without knowing the underlying functional form.

Wavelet analysis, in contrast to Fourier analysis, gives insight into local behavior, whereas Fourier analysis gives insight into global behavior. The Fourier transform processes a time series by transforming the signal from the time domain into the frequency domain. The transformed signal reveals which frequencies are present in the time series and how much energy each frequency contains. However, local effects are visible only in the time domain, not in the frequency domain.

Wavelet analysis makes use of a fully scalable window, which is shifted along the signal in order to capture local behavior in the time domain. This process is repeated several times with different window sizes, resulting in a collection of time-frequency representations of the signal. The transformation of the signal into several wavelet coefficients, each providing information at a different scale, is more often referred to as time-scale decomposition. As there is no direct connection between the Fourier frequency parameter and the wavelet scale parameter, the term scale is reserved for wavelet analysis, whereas the term frequency is reserved for Fourier analysis.

Wavelet analysis utilizes wavelet basis functions, commonly referred to simply as wavelets in the literature, to transform the original data. A wavelet can be described as a function of time t that exhibits certain appealing properties beyond those offered by the "big wave" functions, sines and cosines.

Mathematically, wavelets are defined as functions that satisfy the admissibility condition in (3):

$C_\psi = \int_0^{\infty} \frac{|\hat{\psi}(f)|^2}{f}\, df < \infty$   (3)

where $\hat{\psi}(f)$ is the Fourier transform of the wavelet $\psi(t)$ in the frequency domain.

There are different families of wavelets available. Each of them is capable of adapting to and accentuating certain data characteristics. Typical wavelets include the Haar, Daubechies, Symlet, and Coiflet wavelets.

Using wavelet functions, we can perform the wavelet transform of a signal $X(t)$. The transformation is conducted as in (4):

$W(u,s) = \int_{-\infty}^{\infty} X(t)\, \psi_{u,s}(t)\, dt, \qquad \psi_{u,s}(t) = \frac{1}{\sqrt{s}}\, \psi\!\left(\frac{t-u}{s}\right)$   (4)

where $u$ is the translation parameter shifting the original wavelet function and $s$ is the scale parameter dilating it.

The wavelet transform analyzes and decomposes the original time series $X(t)$ into series at different scales. The original return series can accordingly be reconstructed from the decomposed wavelet coefficients as in (5), provided the admissibility condition is satisfied:

$X(t) = \frac{1}{C_\psi} \int_0^{\infty} \int_{-\infty}^{\infty} W(u,s)\, \psi_{u,s}(t)\, du\, \frac{ds}{s^2}$   (5)
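As a concrete illustration of the decomposition and reconstruction in (4) and (5), the discrete analogue can be computed with the PyWavelets package; the Db4 wavelet, the level-2 setting, and the simulated series below are illustrative choices, not the paper's data.

```python
import numpy as np
import pywt

rng = np.random.default_rng(1)
returns = 0.01 * rng.standard_t(df=5, size=1024)  # heavy-tailed toy return series

# Level-2 discrete wavelet decomposition with Db4: coefficients [A2, D2, D1]
coeffs = pywt.wavedec(returns, 'db4', level=2)

# Reconstruct the original series from the coefficients
reconstructed = pywt.waverec(coeffs, 'db4')
print(np.allclose(returns, reconstructed[:len(returns)]))  # True up to rounding
```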
3.2. Wavelet in Finance and Economics

Ramsey [8] gives an overview of the contribution of wavelets to the analysis of economic and financial data. The ability to represent highly complex structures without knowing the underlying functional form proved to be a great benefit for the analysis of these time series. In addition, wavelets facilitate the precise location of discontinuities and the isolation of shocks.

Furthermore, the smoothing inherent in time-scale decomposition facilitates the reduction of noise in the original signal: the signal is first decomposed into its wavelet components, all values with a magnitude below a certain threshold are then eliminated, and finally the original signal is reconstructed with the inverse wavelet transform [9].
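The decompose-threshold-reconstruct scheme just described might look as follows in Python; the soft thresholding and the universal threshold estimated from the finest detail level are common assumed choices rather than prescriptions from the cited work.

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet='db4', level=2):
    """Denoise by soft-thresholding the detail coefficients, then inverting."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Noise scale estimated from the finest details (an assumed, standard choice)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thresh = sigma * np.sqrt(2.0 * np.log(len(signal)))
    kept = [coeffs[0]] + [pywt.threshold(c, thresh, mode='soft') for c in coeffs[1:]]
    return pywt.waverec(kept, wavelet)[:len(signal)]
```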

Stevenson [10], for example, used wavelet analysis for the filtering of spot electricity prices in the deregulated Australian electricity market. By examining both the demand and price series at different time locations and levels of resolution, Stevenson was able to reveal what was signal and what was noise. Von Sachs and McGibbon researched the contribution of wavelets to non-stationarity and complex functions, deriving the bias and variance of wavelet coefficient estimators under the assumption of local stationarity.

Ramsey and Lampart [11] used wavelet analysis for time-scale decomposition. They researched the relationships between consumption and income and between money and GDP. The time-scale decomposition yielded a new transformed signal built up from the several wavelet coefficients representing the various scales, and at each scale a regression was made between the two variables.

This research yielded three conclusions. First, the relationships between economic variables vary across scales. Second, the decomposition resolved anomalies from the literature. Third, the research made clear that the slope relating consumption and income declines with scale; in this context, the role of the real interest rate was strong in the consumption-income relation.

Chew [12] researched the relationship between money and income, using the same technique of wavelet-based time-scale decomposition as Ramsey and Lampart [11]. This research yielded greater insight into the money-income nexus in Germany.

Arino [13] uses wavelet-based time-scale decomposition for forecasting applications. The approach is to apply forecasting methods to each of the coefficients resulting from the time-scale decomposition; the final forecast of the complete series is then obtained by adding up the individual forecasts.

Aussem and Murtagh [14] use neural networks to examine the individual coefficients; the trained network, with its approximated variables in the target function, is used for the final forecast. In the area of finance, multi-resolution analysis appears useful, as different traders view the market at different time resolutions, for example hourly, daily, weekly, or monthly. The shorter the time period, the higher the frequency. The different types of traders together create the multi-scale dynamics of the time series.

Struzik [15] applies the wavelet-based effective Hölder exponent to examine the correlation level of the Standard & Poor's index locally at arbitrary positions and resolutions (time and scale).

Norsworty et al. [16] apply wavelets to analyze the relationship between the return on an asset and the return on the market portfolio or an investment alternative. As in other research in finance and economics, they applied wavelet-based time-scale decomposition to investigate whether behavior changes across frequencies. The research indicated that the effect of the market return on an individual asset's return is greater at higher frequencies than at lower ones.

4. Wavelet Decomposed VaR Theory

In recent years, more rigorous statistical testing frameworks and research have suggested that further performance improvement cannot be achieved with a single-model approach alone, since the data exhibit complex behavior combining heteroscedasticity, leptokurtosis, long memory, and even chaos. Thus, as the demand for estimation accuracy moves to a new level, the forecasting community increasingly looks for help from new modeling approaches.

The first is the linear combination approach, which combines the forecasting powers of different models and is intuitively straightforward and easy to implement. However, it rests on the assumption that each individual model can strictly separate out the data features of interest and that the models do not interfere with one another. This assumption is so strong that two issues stand out in practice. Firstly, the fitting and estimation of model parameters is distorted for ill-behaved data. Secondly, the first model in the fitting process occupies a privileged position and may destroy data features that the second model is meant to capture.

To tackle the second issue, the nonlinear ensemble approach was introduced. Each individual model is fitted to the data separately, to avoid distortions. Artificial intelligence algorithms such as neural networks and genetic algorithms are then employed as combination mechanisms to find the time-varying weights with which the data features captured by each individual model contribute to the evolution of the entire series. However, despite its recent popularity in the hybrid modeling community, its inherently black-box nature offers only limited insight into the evolution of the underlying driving factors of complicated time series [17].

The previous approaches can all be categorized as ex-post continuous filtering or processing of time series data. Their success depends on the assumption that each filter is capable of fully extracting the features it was designed to capture. Since noisy and ill-behaved data in practice frequently violate the assumptions of these models, significant bias results during the forecasting process, and further performance improvement depends critically on the accuracy of the individual forecasters. If excessively large biases arise during the individual forecasting processes, the artificial intelligence techniques used to achieve maximum in-sample estimation accuracy are inadequate, and the model performs poorly out of sample [18].

Traditionally, various models have been attempted to describe the complex risk evolution process. Each aims to capture particular data characteristics, which results in the quick deterioration of model performance outside the problem domain being investigated.

Although a handful of statistical tests have been used to help identify the existence of particular data characteristics, these tests usually lack sufficient discriminatory power for noisy data and may not cover all the data features under investigation. Thus, semi-parametric approaches have received considerable attention recently, introducing wavelet decomposition techniques as a promising direction for risk estimation [18].

The implementation of the method is laid down as follows:

When the data distribution can be characterized by its first and second moments, the VaR is estimated following (6):

$\mathrm{VaR}_{t+1} = z_{\alpha}\,\hat{\sigma}_{t+1} - \hat{\mu}_{t+1}$   (6)

where $z_{\alpha}$ refers to the corresponding quantile (95th, 97.5th, or 99th) of the assumed distribution, $\hat{\sigma}_{t+1}$ refers to the forecast of the conditional standard deviation at time $t+1$ given the information at time $t$, and $\hat{\mu}_{t+1}$ refers to the forecast of the mean.

The original return series can be decomposed into different time scales using wavelet analysis, as in (9) below.

The VaR estimated to cover portfolio losses is expected to cover the losses at each individual scale, as in (7):

$\mathrm{VaR}_{t+1} = \sum_{j=1}^{J} \mathrm{VaR}^{(j)}_{t+1}$   (7)

Expanding (6) into (7) gives (8):

$\mathrm{VaR}_{t+1} = \sum_{j=1}^{J} \left( z_{\alpha}\,\hat{\sigma}^{(j)}_{t+1} - \hat{\mu}^{(j)}_{t+1} \right)$   (8)

Therefore, the estimation of VaR boils down to the estimation of the conditional mean and the conditional volatility, which involves three steps (a code sketch follows the list):

1. By applying the wavelet transformation to the return series, the data are decomposed into sub-series at different scales j, as in (9):

$f(t) = A_J(t) + \sum_{j=1}^{J} D_j(t)$   (9)

where $f(t)$ is the original signal, $A_J(t)$ is the decomposed series obtained by applying the scaling function at scale $J$, often referred to in the literature as the level-$J$ approximation of the original signal, and $D_j(t)$ is the decomposed series obtained by applying the wavelet function at scale $j$, often referred to as the level-$j$ detail.

2. The conditional mean is estimated by fitting an ARMA model to the training set, which is used to estimate the model parameters; the estimated model is then used to make the out-of-sample forecast, one day ahead. The conditional mean is modeled as an ARMA(r, m) process (r and m are the lag orders), as given in (10):

$x_t = \mu + \sum_{i=1}^{r} \phi_i\, x_{t-i} + \varepsilon_t + \sum_{k=1}^{m} \theta_k\, \varepsilon_{t-k}$   (10)

3. The conditional volatility is modeled as a mixture of GARCH(1, 1) processes, one at each scale, as given in (11):

$\sigma_t^2 = \omega + \alpha\, \varepsilon_{t-1}^2 + \beta\, \sigma_{t-1}^2$   (11)

GARCH(1, 1) is used to fit each individual data series, estimate scale-specific coefficients, and make one-step-ahead forecasts. The variances of the return series are then reconstructed from the scale-level variances by following one of the special properties of wavelet analysis, the preservation of energy: the variances are preserved across the time-scale domain during wavelet decomposition.
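The three steps might be combined as in the following Python sketch, which assumes the `arch` package for the mean and volatility fits; for brevity it uses an AR(1) mean rather than the full ARMA(1,1), reconstructs each scale's series in the time domain before fitting, and sums the scale-wise means and variances as in (8) under the energy-preservation property. All names and settings are illustrative.

```python
import numpy as np
import pywt
from arch import arch_model
from scipy.stats import norm

def wdvar_one_step(returns, wavelet='db4', level=2, confidence=0.99):
    """One-step-ahead Wavelet Decomposed VaR (WDVaR) sketch."""
    # Step 1: decompose the returns into one approximation and `level` detail series
    coeffs = pywt.wavedec(returns, wavelet, level=level)
    components = []
    for i in range(len(coeffs)):
        mask = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        components.append(pywt.waverec(mask, wavelet)[:len(returns)])

    mu_total, var_total = 0.0, 0.0
    for comp in components:
        # Steps 2-3: AR(1) conditional mean with GARCH(1, 1) volatility per scale
        model = arch_model(comp * 100, mean='AR', lags=1, vol='GARCH', p=1, q=1)
        res = model.fit(disp='off')
        fc = res.forecast(horizon=1)
        mu_total += fc.mean.iloc[-1, 0] / 100
        var_total += fc.variance.iloc[-1, 0] / 100**2  # variances add across scales

    z = norm.ppf(1.0 - confidence)
    return -(mu_total + z * np.sqrt(var_total))
```

Summing the scale-wise variances treats the components as uncorrelated, which is what the preservation-of-energy property licenses for orthogonal wavelet decompositions.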

5. Empirical Analysis

In this section we present the data set, the descriptive statistics, the forecast performance results, and the interpretation of the experimental results.

5.1. Presentation of Data

The data set used here consists of four daily average quoted rates against the US dollar: CAD/USD, JPY/USD, SZF/USD and SFR/USD. Our choice of data is justified by the fact that the dollar is by far the most widely traded currency. According to the 1998 survey, the dollar was one of the two currencies involved in an estimated 87 percent of global foreign exchange transactions, equal to about $1.3 trillion a day.

In part, the widespread use of the dollar reflects its substantial international role as an "investment" currency in many capital markets, a "reserve" currency held by many central banks, a "transaction" currency in many international commodity markets, an "invoice" currency in many contracts, and an "intervention" currency employed by monetary authorities in market operations to influence their own exchange rates. The other currencies represent four countries, which are respectively Canada, Japan, Switzerland, and South Africa, broadly representing different continents.

The sample consists of daily data from January 1971 to December 2002. 60% of the data set serves as the training set, while the remaining 40% is used as the test set. One-step-ahead out-of-sample forecasts are conducted to evaluate the accuracy and reliability of the various models under investigation. The original observations are log-differenced (i.e., $r_t = \ln P_t - \ln P_{t-1}$) for further processing and modeling.
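A short sketch of this preprocessing in Python, assuming `prices` holds one currency's daily quotes as a NumPy array; the 60/40 split follows the text.

```python
import numpy as np

def prepare_series(prices, train_frac=0.6):
    """Log-difference the quotes and split into 60% training / 40% test sets."""
    returns = np.diff(np.log(prices))  # r_t = ln(P_t) - ln(P_{t-1})
    split = int(len(returns) * train_frac)
    return returns[:split], returns[split:]
```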

Figure 1, Figure 2, Figure 3 and Figure 4 display the time series of CAD/USD, JPY/USD, SFR/USD and SZF/USD exchange returns from January 1971 to December 1989.

According to these figures, the average of the exchange rate returns appears constant, with no observable change in the mean. We also notice periodic volatility clustering: large variations tend to be followed by large variations of either sign, and periods of tranquility alternate with periods of elevated volatility. For CAD/USD and SZF/USD we notice steeper changes than for JPY/USD and SFR/USD. We refine these remarks through the study of the statistical properties of the exchange rate return series.

5.2. Descriptive Statistics and Hypothesis Testing

Table 1 summarizes the descriptive statistics for the exchange rate returns over the whole period. These facts suggest a highly competitive and volatile market, which makes adequate risk management and control necessary. Firstly, there are significant price fluctuations in the markets, as suggested by the sizeable standard deviations. The substantial difference between the minimum and maximum levels also indicates that foreign exchange dealers could face large gains as well as huge losses if risks are not properly measured and managed.

Secondly, there is a higher probability of losses in the second and fourth markets, as indicated by the negative skewness.

Thirdly, the high level of excess kurtosis, ranging from 3.78416 to 73.74597, suggests that the markets are volatile, with a high probability of extreme events.

The nonlinear and volatile nature of the foreign exchange markets is further confirmed by the formal statistical tests conducted. The rejection of the Jarque-Bera test of normality suggests that the returns deviate significantly from the normal distribution and exhibit leptokurtic behavior.

5.3. Forecast Performance Results
5.3.1. Backtesting Results for ARMA-GARCH-VaR

As mentioned above for the hypothesis-testing approach, the null hypothesis is that the VaR model exhibits statistical properties that are characteristic of accurate VaR estimates. The test statistic is calculated and compared with the critical values corresponding to a certain confidence level to decide whether or not to reject the model at that level.

The order is set to 1 for the GARCH model, since empirical research suggests that GARCH(1, 1) suffices in most situations, and the order is set to 1 for the ARMA process. For the one-day horizon, the complete set of daily volatilities is generated using GARCH(1, 1), and VaR values are forecast on the basis of the estimated volatilities.

As suggested by the experimental results in Table 2, ARMA(1,1)-GARCH(1,1) performs rather well and is accepted under all circumstances. However, its performance gradually deteriorates at higher confidence levels for all markets; it provides much better coverage of risks at lower confidence levels. This implies that the ARMA(1,1)-GARCH(1,1) model may underestimate risk and serve as a generally aggressive risk measure.

The high acceptance rate of ARMA-GARCH VaR supports and confirms the popularity of combining the linear modeling power of the ARMA and GARCH models in the estimation process. However, increasing competition in the markets pushes operators to work on slim margins, implying that additional accuracy and flexibility have to be pursued.


5.3.2. Backtesting Results for WDVaR (Haar, 2)

Two new parameters are introduced during estimations. The first is the wavelet family chosen to decompose the original return series. The second is the decomposition level. The decomposition level is set to 2 and Haar wavelet is chosen as the first wavelet family to decompose the original return series.

As shown by the experimental results in Table 3, WDVaR(Haar, 2) is accepted at almost all confidence levels in the four markets, but fails at the 95% confidence level for SFR/USD. We can also see that WDVaR(Haar, 2) shows inferior performance compared to ARMA-GARCH, i.e. the p-values are lower. This finding is not surprising; it indicates that ARMA-GARCH VaR is an overly conservative risk measure, and the decrease in p-value when switching from ARMA-GARCH VaR to WDVaR results from an improvement in forecasting accuracy.

Moreover, this generally inferior performance of WDVaR(Haar, 2) could be caused by an inappropriate setting of parameters, i.e. the wavelet family and decomposition level chosen.


5.3.3. Backtesting Results for WDVaR (Db4, 2) and WDVaR (Sym6, 2)

Experimental results in Table 4 and Table 5 show that, compared to ARMA-GARCH, WDVaR(Sym6, 2) and WDVaR(Db4, 2) gain a performance improvement (an improvement in p-value) in three markets (JPY/USD, SFR/USD and SZF/USD) at all confidence levels. Thus WDVaR is less aggressive than the ARMA-GARCH approach and provides better coverage of market risks. It offers more flexibility, along with a greater need for control.

However, the performance of WDVaR for the first foreign exchange rate deteriorates compared to that of the ARMA-GARCH model, i.e. the p-values are lower. WDVaR(Sym6, 2) and WDVaR(Db4, 2) seem to provide more conservative coverage of market risks.

Moreover, experiments based on WDVaR(Db4, 2) and WDVaR(Sym6, 2) are conducted at all confidence levels across all four markets to investigate the effect of the wavelet family on the model's performance. The results show a performance improvement (in p-value) in all markets at all confidence levels when switching from the Haar to the Sym6 and Db4 wavelet families. This finding confirms that changing the wavelet family can improve the model's performance.


5.3.4. Backtesting Results for WDVaR (Db4, 5)

Experiments based on WDVaR(Db4, 5) are conducted at all confidence levels across all four foreign exchange markets to investigate the effect of changing the decomposition level on the model's performance.

Analysis of the experimental results in Table 6 indicates that changing the decomposition level from 2 to 5 does not improve the model's performance in terms of p-values.

In fact, any performance improvement from increasing the decomposition level comes not from a higher p-value, but from higher forecasting accuracy.

6. Conclusions

More recently, several ideas have been put forward for portfolio-level value-at-risk approaches to market risk. There have been some attempts to apply wavelet analysis to VaR estimates, based on the assumption that wavelet-decomposed variances at different scales represent investors' preferences. However, they seem to have ignored the impact of the wavelet family chosen for analysis, which leaves their findings largely inconclusive.

In this paper, experiments using daily time series of CAD/USD, JPY/USD, SFR/USD and SZF/USD exchange rate returns are conducted to statistically evaluate the performance of the wavelet approach and the more standard ARMA-GARCH approach to VaR estimation.

ARMA(1,1)-GARCH(1,1) performs rather well and is accepted under all circumstances, but its performance gradually deteriorates at higher confidence levels for all markets. This implies that the ARMA(1,1)-GARCH(1,1) model may underestimate risk and serve as a generally aggressive risk measure.

WDVaR(Haar, 2) shows inferior performance compared to ARMA-GARCH (the p-values are lower). This finding indicates that ARMA-GARCH VaR is an overly conservative risk measure, because the decrease in p-value when switching from ARMA-GARCH VaR to WDVaR is caused by an improvement in forecasting accuracy.

In addition, the appropriate selection and combination of parameters can lead to a comprehensive improvement in reliability (as measured by the p-value). Based on the findings from the experiments above, this paper argues that WDVaR has demonstrated its capability to improve the reliability of VaR estimates at all confidence levels, which offers considerable flexibility and potential performance improvement for foreign exchange dealers.

Further research could investigate the performance of multifractal VaR models, especially given the specific features observed in our data.

References

[1] Jorion, P. (2000). Value at Risk: The New Benchmark for Managing Financial Market Risk. McGraw-Hill, New York.
[2] Wiener (1997). Introduction to value at risk. Risk Management and Regulation in Banking, Jerusalem, Israel.
[3] Hull, J. (2000). Options, Futures and Other Derivatives, Fourth Edition. Prentice Hall, Upper Saddle River.
[4] Jorion, P. (1996). Value at Risk: The New Benchmark for Controlling Market Risk. Irwin, USA.
[5] Daubechies, I. (1992). Ten Lectures on Wavelets. CBMS-NSF Regional Conference Series in Applied Mathematics, 61.
[6] Chui, C. (1992). An Introduction to Wavelets. Academic Press, San Diego.
[7] Graps, A. (1995). An introduction to wavelets. IEEE Computational Science and Engineering, 2.
[8] Ramsey, J. (1999). The contribution of wavelets to the analysis of economic and financial data. Philosophical Transactions of the Royal Society of London, 357.
[9] Walker, J. (2000). Wavelets and Their Scientific Applications. Chapman and Hall/CRC.
[10] Stevenson, M. (2000). Filtering and forecasting spot electricity prices in the increasingly deregulated Australian electricity market.
[11] Ramsey, J., & Lampart, C. (1998). Decomposition of economic relationships by time scale using wavelets. Macroeconomic Dynamics, 2(1), 49-71.
[12] Chew, C. (2001). The money and income relationship of European countries by time scale decomposition using wavelets. Preliminary paper, New York University.
[13] Arino, A. (1996). Forecasting time series via the discrete wavelet transform. Computing in Economics and Finance.
[14] Aussem, A., & Murtagh, F. (1997). Combining neural network forecasts on wavelet-transformed time series. Connection Science, 9, 113-121.
[15] Struzik, Z. (2001). Wavelet methods in (financial) time-series. Physica A, 296, 307-319.
[16] Norsworty, J., Li, D., & Gorener, R. (2000). Wavelet-based analysis of time series: an export from engineering to finance. IEEE International Engineering Management Society Conference, Albuquerque, New Mexico.
[17] In, F., & Kim, S. (2005). The relationship between stock returns and inflation: new evidence from wavelet analysis. Journal of Empirical Finance, 12, 435-444.
[18] Chen, S., He, K.J., Lai, K.K., & Xie, C. (2006). Market Risk Measurement for Crude Oil: A Wavelet Based VaR.
 

Published with license by Science and Education Publishing, Copyright © 2018 Samia Mederessi and Slaheddine Hallara

This work is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this license, visit https://creativecommons.org/licenses/by/4.0/.
