What Is a Risk-Adjusted Return?
A risk-adjusted return is a calculation of the profit or potential profit from an investment that considers the degree of risk that must be accepted to achieve it. The risk is measured in comparison to that of a virtually risk-free investment—usually U.S. Treasuries.
The risk calculation can be shown as a number or a rating, depending on the method used. You can apply risk-adjusted returns to stocks, investment funds, and entire portfolios.
Key Takeaways
- Risk-adjusted return measures an investment's profitability relative to the risk involved, using the risk-free rate as a benchmark.
- The Sharpe ratio and Treynor ratio are common methods for evaluating risk-adjusted returns, each using different risk metrics.
- While the Sharpe ratio focuses on returns per unit of standard deviation, the Treynor ratio uses beta to assess returns per unit of systematic risk.
- Additional risk-adjustment metrics include alpha, beta, and standard deviation, each offering unique insights into an investment's performance.
- Higher risk can lead to better returns, especially in strong markets, though it may also result in greater losses during volatile periods.
How Risk-Adjusted Returns Improve Investment Analysis
The risk-adjusted return measures the profit your investment has made relative to the amount of risk it carried over a given period. If multiple investments have the same return over a period, the one with the least risk has the better risk-adjusted return. Analysts can also use a MAR ratio (return relative to maximum drawdown) to compare the performance of trading strategies, hedge funds, and trading advisors.
Some common risk measures used in investing include alpha, beta, R-squared, standard deviation, and the Sharpe ratio. To compare investments, use the same risk measure for each to get a clear performance perspective.
Important
Different risk measurements give varying results, so be clear about which type of risk-adjusted return you are considering.
Common Risk-Adjusted Return Calculation Methods
Here is a breakdown of the most commonly used measurements.
Sharpe Ratio
The Sharpe ratio measures the profit of an investment that exceeds the risk-free rate per unit of standard deviation. It is calculated by taking the return of the investment, subtracting the risk-free rate, and dividing this result by the investment's standard deviation.
All else equal, a higher Sharpe ratio is better. The risk-free rate used is the yield on a very low-risk investment, usually the 10-year Treasury bond (T-bond), for the relevant period.
For example, say Mutual Fund A returned 12% over the past year and had a standard deviation of 10%. Mutual Fund B returned 10% with a standard deviation of 7%, and the risk-free rate over the period was 3%. The Sharpe ratios would be calculated as follows:
- Mutual Fund A: (12% - 3%) / 10% = 0.9
- Mutual Fund B: (10% - 3%) / 7% = 1
Even though Mutual Fund A had a higher return, Mutual Fund B had a higher risk-adjusted return, meaning that it gained more per unit of total risk than Mutual Fund A.
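For readers who want to reproduce the arithmetic, here is a minimal Python sketch of the Sharpe ratio calculation using the hypothetical fund figures from the example above (the function name and values are illustrative only):

```python
def sharpe_ratio(annual_return, risk_free_rate, std_dev):
    """Excess return over the risk-free rate per unit of total risk (standard deviation)."""
    return (annual_return - risk_free_rate) / std_dev

# Hypothetical figures from the example above, expressed as decimals.
risk_free = 0.03
fund_a = sharpe_ratio(0.12, risk_free, 0.10)  # 0.9
fund_b = sharpe_ratio(0.10, risk_free, 0.07)  # 1.0

print(f"Fund A Sharpe: {fund_a:.2f}, Fund B Sharpe: {fund_b:.2f}")
```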
Treynor Ratio
The Treynor ratio is calculated the same way as the Sharpe ratio but uses the investment's beta in the denominator instead of its standard deviation. As with the Sharpe ratio, a higher Treynor ratio is better.
Using the previous fund example and assuming that each of the funds has a beta of 0.75, the calculations are as follows:
- Mutual Fund A: (12% - 3%) / 0.75 = 0.12
- Mutual Fund B: (10% - 3%) / 0.75 ≈ 0.093
Here, Mutual Fund A has a higher Treynor ratio, meaning the fund earns more return per unit of systematic risk than Fund B.
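The same calculation can be sketched in Python; only the denominator changes from standard deviation to beta (again, the figures are the hypothetical ones used above):

```python
def treynor_ratio(annual_return, risk_free_rate, beta):
    """Excess return over the risk-free rate per unit of systematic risk (beta)."""
    return (annual_return - risk_free_rate) / beta

risk_free = 0.03
fund_a = treynor_ratio(0.12, risk_free, 0.75)  # 0.12
fund_b = treynor_ratio(0.10, risk_free, 0.75)  # ~0.093

print(f"Fund A Treynor: {fund_a:.3f}, Fund B Treynor: {fund_b:.3f}")
```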
Additional Metrics for Evaluating Investment Risk
Here are some of the other popular risk-adjustment measures (a short computational sketch follows the list):
- Alpha: An investment's excess return relative to the return of a benchmark.
- Beta: An investment's volatility relative to the overall market. The market is assigned a beta of one; an investment with a beta above one fluctuates more than the overall market, while a beta below one fluctuates less.
- Standard deviation: The volatility of an investment's returns around its average return; a larger standard deviation means returns are spread more widely, while a smaller one means returns cluster closer to the average.
- R-squared: The percentage of an investment's performance that can be explained by the performance of an index.
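As a rough illustration of how these measures can be estimated from a return series, here is a brief Python sketch using hypothetical monthly returns for a fund and a benchmark index (the data and the simplification of ignoring the risk-free rate in alpha are assumptions for illustration):

```python
import numpy as np

# Hypothetical monthly returns for a fund and its benchmark index, as decimals.
fund = np.array([0.02, -0.01, 0.03, 0.01, -0.02, 0.04])
market = np.array([0.015, -0.005, 0.025, 0.012, -0.015, 0.03])

std_dev = fund.std(ddof=1)                                       # volatility of the fund's returns
beta = np.cov(fund, market, ddof=1)[0, 1] / market.var(ddof=1)   # sensitivity to market moves
alpha = fund.mean() - beta * market.mean()                       # simple excess return vs. the market (risk-free rate ignored)
r_squared = np.corrcoef(fund, market)[0, 1] ** 2                 # share of the fund's variance explained by the market

print(f"std dev={std_dev:.4f}, beta={beta:.2f}, alpha={alpha:.4f}, R^2={r_squared:.2f}")
```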
Important Considerations for Risk-Adjusted Returns
Risk avoidance is not always a good thing in investing, so be wary of overreacting to these numbers, especially if the measured timeline is short. In strong markets, a lower-risk mutual fund can lag the performance an investor expects.
A fund taking on more risk than its benchmark might see better returns. High-risk mutual funds may suffer more losses during volatile times, but they often outperform their benchmarks in full market cycles.
What Are the 4 Risk-Adjusted Return Measures?
The Sharpe ratio, alpha, beta, and standard deviation are the most popular ways to measure risk-adjusted returns.
Is Risk-Adjusted Return the Sharpe Ratio?
The Sharpe ratio is one of several ways to measure an asset's risk-adjusted return.
What Is the Risk-Adjusted Return on Real Estate?
The popular measurements can be used to evaluate real-estate risk and returns if you have the data. For the Sharpe ratio, you'd need to know the property's average return and the standard deviation of its returns. Using the 10-year Treasury yield as the risk-free rate, you could then determine the property's risk-adjusted return.
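A minimal sketch of that calculation, assuming purely hypothetical figures for the property's return, its volatility, and the Treasury yield:

```python
# Hypothetical inputs: a property's average annual return, the volatility of those
# returns, and the 10-year Treasury yield used as the risk-free rate.
property_return = 0.08    # 8% average annual return (assumed)
property_std_dev = 0.12   # 12% standard deviation of annual returns (assumed)
treasury_yield = 0.04     # 4% 10-year Treasury yield (assumed)

sharpe = (property_return - treasury_yield) / property_std_dev
print(f"Property Sharpe ratio: {sharpe:.2f}")  # 0.33
```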
The Bottom Line
Analysts use risk-adjusted return metrics to evaluate the risk-reward profile of an investment compared to a risk-free benchmark, commonly the 10-year Treasury bond. Key methods like the Sharpe and Treynor ratios, alongside measures such as alpha and beta, provide diverse perspectives on risk.
These tools help investors understand if the potential reward justifies the risk taken. It’s crucial to employ the same risk measure across different investments for a clear comparison of their performances.