Variance is a risk measure commonly used to analyze historical and expected returns. It measures how widely actual returns are dispersed around an expected return, and it is closely associated with other risk measures such as standard deviation, volatility and beta. You can calculate variance for any time period, but it is common to annualize risk measures, because investors typically disclose and analyze risk and return in one-year increments.
Annualizing weekly variance simply requires multiplying it by 52, because there are 52 weeks in a year. This approach assumes that weekly variance is a representative estimate for the whole year; no growth or loss is factored into the annualization. For example, a weekly variance of 1 percent multiplied by 52 gives an annualized variance of 52 percent.
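The calculation above can be sketched in a few lines of Python. The weekly return figures here are hypothetical, chosen only to illustrate the arithmetic:

```python
import statistics

# Hypothetical weekly returns in decimal form (e.g. 0.012 = 1.2 percent).
weekly_returns = [0.012, -0.008, 0.005, 0.010, -0.003, 0.007, 0.002, -0.011]

# Sample variance of the weekly returns.
weekly_variance = statistics.variance(weekly_returns)

# Annualize by multiplying by 52, the number of weeks in a year.
annual_variance = weekly_variance * 52

print(f"Weekly variance: {weekly_variance:.6f}")
print(f"Annual variance: {annual_variance:.6f}")
```

Note that this scaling carries the weekly figure's assumptions with it: if the sample weeks were unusually calm or turbulent, the annualized variance inherits that bias.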
Figures derived over longer time periods are more statistically significant. An annualized figure is not necessarily more significant, however, because the assumption that weekly variance is a good proxy for annual variance may be inaccurate. Even so, annualizing is necessary to calculate other annual risk measures that require annual variance as an input. For example, if you need to estimate the market value of a stock option with a one-year maturity, annual volatility is a critical component of the calculation, and annual volatility is the square root of annual variance. So if you only have solid weekly variance figures, you would annualize them for use in the calculation.
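As a minimal sketch of that last step, the snippet below converts an assumed weekly variance of 0.0004 (a weekly standard deviation of 2 percent, chosen purely for illustration) into the annual volatility an option-pricing model would take as input:

```python
import math

# Hypothetical weekly variance of returns (decimal form).
weekly_variance = 0.0004

# Annualize the variance, then take the square root to get volatility.
annual_variance = weekly_variance * 52
annual_volatility = math.sqrt(annual_variance)

print(f"Annual volatility: {annual_volatility:.4f}")  # about 0.1442, i.e. 14.42 percent
```

The annual volatility figure, not the variance itself, is what standard option-valuation models such as Black-Scholes expect as their risk input.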