Time Series Analysis - Case Study Example

Summary
The study "Time Series Analysis" focuses on the critical evaluation of the time series analysis, an array of data points measured at uniform time intervals. Time series analysis is the method which analyzes the time series data to extract meaningful statistics…

Extract of sample "Time Series Analysis"

Time Series Analysis

A time series is an array of data points measured at uniform time intervals. Time series analysis comprises the methods that analyze such data in order to extract meaningful statistics. It is also used for forecasting, predicting future values from previously observed values (Boashash, 2003). The forecasting methods used in time series analysis are outlined below.

ARIMA (Autoregressive Integrated Moving Average): a general forecasting model for time series that can be stationarized by transformations such as differencing and logging. ARIMA models are fine-tuned versions of the random walk and random trend models; the fine-tuning consists of adding lags of the differenced series and lags of the forecast errors to the prediction equation as needed to remove any remaining autocorrelation from the forecast errors. Lags of the differenced series appear as autoregressive (AR) terms and lags of the forecast errors as moving average (MA) terms. A series that must be differenced to be made stationary is called an integrated version of the stationary series (Lin, 2003).

Decompositional analysis: deconstructing a time series into notional components, that is, constructing from the observed series a number of component series, each with a particular feature. For example, a quarterly or monthly series can be broken down into a trend component (the long-term progression of the series), a cyclical component (repeated, periodic fluctuations), a seasonal component (seasonal variation) and an irregular component (random, irregular influences) (Shumway, 1988).

Smoothing: a method applied to time series data to produce smoothed values from which forecasts are made. Because a time series is a sequence of observations, it may be a random process or an orderly but noisy one. In a moving average forecast the past observations are weighted equally, whereas exponential smoothing assigns exponentially decreasing weights over time (Dodge, 2003).

Regression forecasts: used to predict the magnitude of one variable of a time series from another variable that is assigned a value in the future predictions (Brown, 1963).

Data Analysis: Experimentation on Time Series Data

Seasonal Analysis

The following data represent the number of rainy days at Waikiki Beach, Hawaii, during the prime tourist season of December and January (a 62-day subgroup), recorded over a 20-year period. The stor column gives the number of stormy days in the same period. Univariate statistics were calculated from the rainy days, and a regression on the stormy days was carried out for the data analysis.

Days  Subgroup  stor
21    62        4
27              5
19              5
17              3
16              4
19              5
25              6
36              5
23              8
26              5
12              7
16              4
27              4
41              4
18              4
18              6
10              4
22              3
15              3
24              7

Univariate ARIMA Extrapolation Forecast
time  Y[t]  F[t]  95% LB    95% UB   p-value (H0: Y[t]=F[t])  P(F[t]>Y[t-1])  P(F[t]>Y[t-s])  P(F[t]>Y[17])
15    18    -     -         -        -                        -               -               -
16    18    -     -         -        -                        -               -               -
17    10    -     -         -        -                        -               -               -
18    22    0     -45.3997  45.3997  0.1711                   0.333           0.2186          0.333
19    15    0     -45.3997  45.3997  0.2586                   0.1711          0.333           0.333
20    24    0     -45.3997  45.3997  0.1501                   0.2586          0.1711          0.333

Univariate ARIMA Extrapolation Forecast Performance
time  % S.E.  PE  MAPE  sMAPE  Sq.E  MSE       RMSE     ScaledE  MASE
18    Inf     1   1     2      484   0         0        2.75     2.75
19    Inf     1   1     2      225   354.5     18.8282  1.875    2.3125
20    Inf     1   1     2      576   428.3333  20.6962  3        2.5417
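The forecasts in the two tables above come from an automated univariate ARIMA routine. As a minimal illustrative sketch, a comparable forecast could be produced in Python with statsmodels; the (0,1,0) model order, the three-observation hold-out and the variable names below are assumptions made for illustration, not the exact settings behind the tables above.

```python
# Illustrative sketch (assumed tooling): univariate ARIMA extrapolation
# forecast for the Waikiki rainy-day series. The (0,1,0) order and the
# three-step hold-out are assumptions, not the settings used above.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

days = pd.Series([21, 27, 19, 17, 16, 19, 25, 36, 23, 26,
                  12, 16, 27, 41, 18, 18, 10, 22, 15, 24])

train, test = days[:-3], days[-3:]          # hold out the last three seasons (t = 18, 19, 20)

fit = ARIMA(train, order=(0, 1, 0)).fit()   # random-walk-type ARIMA(0,1,0)

pred = fit.get_forecast(steps=3)
print(pred.predicted_mean)                  # point forecasts F[t]
print(pred.conf_int(alpha=0.05))            # 95% lower and upper bounds
print(test.values)                          # observed Y[t] for comparison
```

Comparing the point forecasts and their 95% bounds with the held-out observations is what the forecast-performance table above summarizes through PE, MAPE, MSE and related measures.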
Classical Decomposition by Moving Averages
t   Observations  Fit      Trend  Seasonal   Random
1   21            NA       NA     -3.04167   NA
2   27            26.5417  23.5   3.04167    0.458333
3   19            17.4583  20.5   -3.04167   1.54167
4   17            20.2917  17.25  3.04167    -3.29167
5   16            13.9583  17     -3.04167   2.04167
6   19            22.7917  19.75  3.04167    -3.79167
7   25            23.2083  26.25  -3.04167   1.79167
8   36            33.0417  30     3.04167    2.95833
9   23            23.9583  27     -3.04167   -0.958333
10  26            24.7917  21.75  3.04167    1.20833
11  12            13.4583  16.5   -3.04167   -1.45833
12  16            20.7917  17.75  3.04167    -4.79167
13  27            24.7083  27.75  -3.04167   2.29167
14  41            34.7917  31.75  3.04167    6.20833
15  18            20.7083  23.75  -3.04167   -2.70833
16  18            19.0417  16     3.04167    -1.04167
17  10            11.9583  15     -3.04167   -1.95833
18  22            20.2917  17.25  3.04167    1.70833
19  15            15.9583  19     -3.04167   -0.958333
20  24            NA       NA     3.04167    NA

Multiple Linear Regression - Estimated Regression Equation
Days[t] = 19.871 + 0.360215 stor[t] + e[t]

Multiple Linear Regression - Ordinary Least Squares
Variable     Parameter  S.D.     T-STAT (H0: parameter = 0)  2-tail p-value  1-tail p-value
(Intercept)  19.871     6.30781  3.15                        0.00553569      0.00276784
stor         0.360215   1.26409  0.285                       0.778928        0.389464

Multiple Linear Regression - Regression Statistics
Multiple R               0.0670146
R-squared                0.00449096
Adjusted R-squared       -0.0508151
F-TEST (value)           0.0812019
F-TEST (DF numerator)    1
F-TEST (DF denominator)  18
p-value                  0.778928

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation  7.70993
Sum Squared Residuals        1069.97

Interpolation Forecasts of Exponential Smoothing
t   Observed  Fitted            Residuals
2   27        21                6
3   19        21.0003966417681  -2.00039664176811
4   17        21.0002644016246  -4.00026440162463
5   16        20.9999999563004  -4.99999995630043
6   19        20.9996694214966  -1.99966942149656
7   25        20.9995372294274  4.0004627705726
8   36        20.9998016878652  15.0001983121348
9   23        21.0007933053953  1.99920669460475
10  26        21.0009254668749  4.99907453312505
11  12        21.0012559405019  -9.0012559405019
12  16        21.0006608948233  -5.00066089482333
13  27        21.0003303163268  5.99966968367318
14  41        21.0007269362587  19.9992730637413
15  18        21.0020490274302  -3.00204902743021
16  18        21.0018505710912  -3.00185057109118
17  10        21.0016521278715  -11.0016521278715
18  22        21.0009248420795  0.999075157920533
19  15        21.0009908879023  -6.00099088790232
20  24        21.0005941806296  2.99940581937038

Extrapolation Forecasts of Exponential Smoothing
t   Forecast          95% Lower Bound   95% Upper Bound
21  21.0007924622342  5.85769225284239  36.143892671626
22  21.0007924622342  5.85769221975373  36.1438927047147
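The classical decomposition tabulated above separates the rainy-day series into trend, seasonal and random components by moving averages. A minimal sketch of a comparable additive decomposition is given below; the use of statsmodels' seasonal_decompose and the choice period=2 (inferred from the seasonal component alternating between roughly -3.04 and +3.04) are illustrative assumptions rather than the exact routine used above.

```python
# Illustrative sketch (assumed tooling): classical additive decomposition
# of the rainy-day series by moving averages. period=2 is an assumption
# inferred from the alternating +/-3.04 seasonal effects in the table above.
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

days = pd.Series([21, 27, 19, 17, 16, 19, 25, 36, 23, 26,
                  12, 16, 27, 41, 18, 18, 10, 22, 15, 24])

decomp = seasonal_decompose(days, model="additive", period=2)

print(decomp.trend)     # centred moving-average trend (NA at the series ends)
print(decomp.seasonal)  # alternating seasonal effects
print(decomp.resid)     # random (irregular) component
```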
Non Seasonal Analysis

The following data represent the annual percent change in the consumer price index (%CPI) for a sequence of recent years; average salary (sal) was used as the predictor in the regression analysis on the CPI. Reference: Statistical Abstract of the United States.

%CPI  sal
1.3   22
1.3   22
1.6   22
2.9   22
3.1   22
4.2   22
5.5   22
5.7   22
4.4   22
3.2   21
6.2   21
11    23
9.1   35
5.8   32
6.5   32
7.6   32
11.3  32
13.5  32
10.3  32
6.2   32
3.2   32
4.3   32
3.6   32
1.9   32
3.6   32
4.1   32
4.8   42
5.4   38
4.2   37
3     32

Univariate ARIMA Extrapolation Forecast
time  Y[t]         F[t]  95% LB    95% UB   p-value (H0: Y[t]=F[t])  P(F[t]>Y[t-1])  P(F[t]>Y[t-s])  P(F[t]>Y[27])
25    3.599999905  -     -         -        -                        -               -               -
26    4.099999905  -     -         -        -                        -               -               -
27    4.800000191  -     -         -        -                        -               -               -
28    5.4          0     -12.2948  12.2948  0.1947                   0.2221          0.2567          0.2221
29    4.2          0     -12.2948  12.2948  0.2516                   0.1947          0.2221          0.2221
30    3            0     -12.2948  12.2948  0.3162                   0.2516          0.1947          0.2221

Univariate ARIMA Extrapolation Forecast Performance
time  % S.E.  PE  MAPE  sMAPE  Sq.E   MSE   RMSE    ScaledE  MASE
28    Inf     1   1     2      29.16  0     0       4.5      4.5
29    Inf     1   1     2      17.64  23.4  4.8374  3.5      4
30    Inf     1   1     2      9      18.6  4.3128  2.5      3.5

Classical Decomposition by Moving Averages
t   Observations  Fit      Trend    Seasonal   Random
1   1.299999952   NA       NA       0.0724692  NA
2   1.299999952   1.23654  1.4      -0.163457  0.0634568
3   1.600000024   2.02432  1.93333  0.0909877  -0.424321
4   2.900000095   2.6058   2.53333  0.0724692  0.294198
5   3.099999905   3.23654  3.4      -0.163457  -0.136543
6   4.199999809   4.35765  4.26667  0.0909877  -0.157654
7   5.5           5.2058   5.13333  0.0724692  0.294198
8   5.699999809   5.03654  5.2      -0.163457  0.663457
9   4.400000095   4.52432  4.43333  0.0909877  -0.124321
10  3.200000048   4.67247  4.6      0.0724692  -1.47247
11  6.199999809   6.63654  6.8      -0.163457  -0.436543
12  11            8.85765  8.76667  0.0909877  2.14235
13  9.1           8.7058   8.63333  0.0724692  0.394197
14  5.800000191   6.96988  7.13333  -0.163457  -1.16988
15  6.5           6.72432  6.63333  0.0909877  -0.224321
16  7.599999905   8.53914  8.46667  0.0724692  -0.939136
17  11.30000019   10.6365  10.8     -0.163457  0.663457
18  13.5          11.791   11.7     0.0909877  1.70901
19  10.30000019   10.0725  10       0.0724692  0.227531
20  6.199999809   6.40321  6.56667  -0.163457  -0.20321
21  3.200000048   4.65765  4.56667  0.0909877  -1.45765
22  4.300000191   3.77247  3.7      0.0724692  0.527531
23  3.599999905   3.10321  3.26667  -0.163457  0.49679
24  1.899999976   3.12432  3.03333  0.0909877  -1.22432
25  3.599999905   3.27247  3.2      0.0724692  0.327531
26  4.099999905   4.00321  4.16667  -0.163457  0.0967901
27  4.800000191   4.85765  4.76667  0.0909877  -0.0576542
28  5.400000095   4.87247  4.8      0.0724692  0.527531
29  4.199999809   4.03654  4.2      -0.163457  0.163457
30  3             NA       NA       0.0909877  NA

Estimated Parameters of Exponential Smoothing
Parameter  Value
alpha      0.999933893038648
beta       FALSE
gamma      FALSE

Interpolation Forecasts of Exponential Smoothing
t   Observed     Fitted            Residuals
2   1.299999952  1.299999952       0
3   1.600000024  1.299999952       0.300000072
4   2.900000095  1.59998019190683  1.30001990309317
5   3.099999905  2.89991415463451  0.20008575036549
6   4.199999809  3.09998667793903  1.10001313106097
7   5.5          4.19992709047446  1.30007290952554
8   5.699999809  5.49991405613042  0.200085752869585
9   4.400000095  5.69998658193887  -1.29998648693887
10  3.200000048  4.40008603315645  -1.20008598515645
11  6.199999809  3.20007938203784  2.99992042696216
12  11           6.19980149337628  4.80019850662372
13  9.1          10.9996826734628  -1.89968267346284
14  5.800000191  9.10012558224908  -3.30012539124908
15  6.5          5.8002183522617   0.699781647738305
16  7.599999905  6.49995373956166  1.10004616543834
17  11.30000019  7.59992718429066  3.70007300570934
18  13.5         11.2997555894168  2.20024441058319
19  10.30000019  13.4998545485278  -3.19985435852778
20  6.199999809  10.3002117226484  -4.10021191364841
21  3.200000048  6.20027086155051  -3.00027081355051
22  4.300000191  3.20019838678672  1.09980180421328
23  3.599999905  4.29992748644463  -0.699927581444634
24  1.899999976  3.60004617508558  -1.70004619908558
25  3.599999905  1.90011236088838  1.69988754411162
26  4.099999905  3.59988753059982  0.500112374400181
27  4.800000191  4.09996684409059  0.700033346909406
28  5.400000095  4.79995391392259  0.60004618107741
29  4.199999809  5.3999604277703   -1.1999606187703
30  3            4.20007913475025  -1.20007913475025

Extrapolation Forecasts of Exponential Smoothing
t   Forecast          95% Lower Bound    95% Upper Bound
31  3.00007933358498  -1.09236636717234  7.0925250343423
32  3.00007933358498  -2.78732158279578  8.78748024996574
33  3.00007933358498  -4.08793215908365  10.0880908262536
34  3.00007933358498  -5.18440626255851  11.1845649297285
35  3.00007933358498  -6.15042349559102  12.150582162761
36  3.00007933358498  -7.02377220063258  13.0239308678025
37  3.00007933358498  -7.82690072242849  13.8270593895984
38  3.00007933358498  -8.57443554551864  14.5745942126886
39  3.00007933358498  -9.27653633360396  15.2766950007739
40  3.00007933358498  -9.9406003159986   15.9407589831686
41  3.00007933358498  -10.5722118269533  16.5723704941232
42  3.00007933358498  -11.1757093534582  17.1758680206282
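The estimated smoothing parameter above (alpha close to 1, with no trend or seasonal terms) corresponds to simple exponential smoothing that essentially tracks the most recent observation, which is why the extrapolation forecasts are flat. A minimal sketch of how such a fit could be reproduced is shown below; the SimpleExpSmoothing call from statsmodels and the variable names are illustrative assumptions, not the exact routine used above.

```python
# Illustrative sketch (assumed tooling): simple exponential smoothing of
# the %CPI series with alpha estimated from the data. For this series the
# optimiser returns alpha close to 1, matching the parameter reported above.
import pandas as pd
from statsmodels.tsa.holtwinters import SimpleExpSmoothing

cpi = pd.Series([1.3, 1.3, 1.6, 2.9, 3.1, 4.2, 5.5, 5.7, 4.4, 3.2,
                 6.2, 11.0, 9.1, 5.8, 6.5, 7.6, 11.3, 13.5, 10.3, 6.2,
                 3.2, 4.3, 3.6, 1.9, 3.6, 4.1, 4.8, 5.4, 4.2, 3.0])

fit = SimpleExpSmoothing(cpi).fit()      # alpha chosen by the optimiser

print(fit.params["smoothing_level"])     # estimated alpha
print(fit.fittedvalues)                  # one-step-ahead (interpolation) forecasts
print(fit.forecast(12))                  # flat extrapolation forecasts for t = 31..42
```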
Multiple Linear Regression - Estimated Regression Equation
%CPI[t] = 1.56871 + 0.129477 sal[t] + e[t]

Multiple Linear Regression - Ordinary Least Squares
Variable     Parameter  S.D.       T-STAT (H0: parameter = 0)  2-tail p-value  1-tail p-value
(Intercept)  1.56871    2.72316    0.5761                      0.569176        0.284588
sal          0.129477   0.0926706  1.397                       0.173335        0.0866676

Multiple Linear Regression - Regression Statistics
Multiple R               0.255292
R-squared                0.0651742
Adjusted R-squared       0.0317875
F-TEST (value)           1.9521
F-TEST (DF numerator)    1
F-TEST (DF denominator)  28
p-value                  0.173335

Multiple Linear Regression - Residual Statistics
Residual Standard Deviation  3.04457
Sum Squared Residuals        259.544

References

Boashash, B. (ed.) (2003). Time-Frequency Signal Analysis and Processing: A Comprehensive Reference. Oxford: Elsevier Science.
Brown, R. G. (1963). Smoothing, Forecasting and Prediction of Discrete Time Series. Englewood Cliffs, NJ: Prentice-Hall.
Dodge, Y. (2003). The Oxford Dictionary of Statistical Terms. New York: Oxford University Press.
Lin, J., Keogh, E., Lonardi, S., & Chiu, B. (2003). "A symbolic representation of time series, with implications for streaming algorithms". Proceedings of the 8th ACM SIGMOD Workshop on Research Issues in Data Mining and Knowledge Discovery. New York: ACM Press.
Shumway, R. H. (1988). Applied Statistical Time Series Analysis. Englewood Cliffs, NJ: Prentice Hall.