The Geeky αlphα



As the results of the Brexit referendum and the 2016 US elections made evident, human beings exhibit highly irrational behavior. One of the best examples of such anomalous behavior is the financial bubble, which is created when people trade an asset so heavily that its price is pushed far beyond what the asset is actually worth. This article covers how a flower became more expensive than a house in Amsterdam, what causes a financial bubble, and the life cycle of a bubble.

How did a flower become more expensive than a house?

Tulipmania, which gripped the Netherlands in the 17th century, is believed to be the world's first financial bubble. Within a few months, tulip bulb prices skyrocketed to insane heights before nose-diving to worthlessness, leaving several investors bankrupt. At one point, a single tulip bulb was valued at around 6,000 guilders, roughly the cost of a luxurious mansion in Amsterdam! This 3-minute video perfectly elucidates how it all happened.



What causes a financial bubble?

A bubble is created when people trade an asset at prices that far exceed its actual (or intrinsic) value. As this trading continues, the price of the asset keeps inflating, making it seem highly attractive to potential investors. As more and more investors join the bandwagon, the price starts increasing at an unprecedented rate.


Stage 1: Displacement

All bubbles are formed when something new is perceived to be happening. It might be a new discovery, a breakthrough technology or anything else with the potential for immense future value. In this stage, smart investors notice that something big is about to happen and start investing in it. Having the first-mover advantage, these investors get to ride the price curve once asset prices start moving upwards. In the 1990s it was the internet; in the late 2000s it was the newly tweaked Collateralized Debt Obligations (CDOs) stuffed with subprime loans that were selling like hotcakes. In Tulipmania, it was tulip bulbs.

Stage 2: Boom

This is the rapid growth phase. Once the bubble formation picks up, people develop a convincing narrative to justify why the asset is a good investment. This narrative becomes self-reinforcing as more and more people start talking about it. After acquiring critical mass, this narrative keeps gaining momentum and spreads like wildfire. This results in a huge number of people believing in the high future value of the asset. 

Stage 3: Euphoria

In this phase, the asset reaches its peak value. As prices are rising at a very high rate, everyone finds the asset to be a phenomenal investment. Even those who were sitting on the fence are convinced that it is the right time to invest. The number of investors keeps snowballing further pushing up the price of the asset. As more and more people are ready to invest, the asset price reaches new highs shattering all previous records.

Stage 4: Crisis

For each bubble, there is a tipping point when people start realizing that the asset is hugely overpriced. This usually starts with people within the industry who are working very closely with the asset. These industry insiders then start selling off their investments. At some point in time, the news about this insider selling leaks out to the general public. As a consequence, panic selling starts and all investors are trying to get rid of the asset by dumping it back in the market. 

Stage 5: Revulsion

As the panic spreads, the asset gets demonized and is looked upon as an evil investment by everyone in the investment world. Even journals that were singing the asset's praises a few months earlier start writing negative things about it. As shown in the graph below, the asset's price goes into free fall, dropping even below its intrinsic value. Finally, with time, things settle down and the price reverts to its mean.

Anatomy of a Typical Bubble (Source: Jean-Paul Rodrigue)


Are we going to have a bubble soon?

In June 2018, it will be 10 years since the last major financial meltdown. Going by the past few decades, the usual frequency of occurrence is once in a decade. Fortunately or unfortunately, this decade hasn't had a major bubble yet! Should we expect one soon? If only we could predict human behavior. Either way, we should always be prepared for one.

To end it on a lighter note, here is a cartoon that very aptly depicts human behavior...
 

Money is a necessary evil. Fortunately or unfortunately, most of it is managed by the biggest banks across the globe. This is the reason why we need an infallible banking system that we can always count on. However, sometimes the financial system becomes too vulnerable to its own imperfections and this often leads to a complete failure of the system. The financial crisis of 2008 is the most recent example of such a debacle. 

Why Basel III regulations were required (Image Courtesy: about.com)

The financial services industry needs a clearly defined set of regulatory guidelines to keep functioning in a smooth manner. These guidelines not only help in establishing trust in the financial system and the intermediaries involved but also make sure that these financial institutions have the ability to pay off their liabilities in times of financial distress.

What are Basel Accords?

Basel accords are sets of regulations (Basel I, II and III) for the banking sector set by the Basel Committee on Banking Supervision. The purpose of these accords is to strengthen the worldwide bank regulatory framework.


Basel Committee on Banking Supervision (BCBS)

The Basel Committee on Banking Supervision, established in 1974, provides a forum for regular cooperation on banking supervisory matters. Its objective is to enhance understanding of key supervisory issues and improve the quality of banking supervision worldwide. The Committee's members come from Argentina, Australia, Belgium, Brazil, Canada, China, European Union, France, Germany, Hong Kong SAR, India, Indonesia, Italy, Japan, Korea, Luxembourg, Mexico, the Netherlands, Russia, Saudi Arabia, Singapore, South Africa, Spain, Sweden, Switzerland, Turkey, the United Kingdom and the United States.
BCBS Member Countries

Why did we ever need a Basel Accord?

In the 1980s, the rate of bank failures in the United States was increasing at an appalling rate, primarily due to the Savings and Loan (S&L) Crisis and the fact that banks had been lending recklessly. As a result, the external debt of many countries had been growing at an unsustainable rate and the probability of major international banks going belly up was alarmingly high. The banking industry was in turmoil and badly needed a framework to bring some order amidst the chaos. To prevent all hell from breaking loose, representatives from the central banks and supervisory authorities of the G10 countries, working through the Basel Committee on Banking Supervision (BCBS) in Basel, Switzerland, drew up guidelines relating to the capital and risk management activities of global banking institutions. This was the beginning of the Basel Accords.

Basel I

Basel I is the first in the series of regulations issued by the BCBS and was enacted in 1988 to improve banking stability. It weighed the capital owned by a bank against the credit risk it faced. Basel I defined the bank capital ratio and set the ball rolling for solvency monitoring and reporting. The main highlights of this accord are listed below:
  1. Assets of financial institutions are broadly divided into five risk categories (0%, 10%, 20%, 50% and 100%). 
  2. Banks that operate internationally are required to have a minimum of 8% capital to risk-weighted assets.
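The 8% test above is just a weighted sum followed by a ratio check. A minimal sketch in Python (the asset classes, exposure amounts and capital figure are hypothetical, chosen only to illustrate the five risk weights):

```python
# Basel I-style capital adequacy check (illustrative figures only).
# Each exposure is weighted by its risk category, and the bank's capital
# is compared against 8% of the risk-weighted total.

RISK_WEIGHTS = {  # the five Basel I risk categories
    "cash": 0.0,
    "public_sector": 0.10,
    "interbank": 0.20,
    "residential_mortgage": 0.50,
    "corporate_loan": 1.00,
}

def risk_weighted_assets(exposures):
    """exposures: dict mapping asset class -> exposure amount."""
    return sum(amount * RISK_WEIGHTS[cls] for cls, amount in exposures.items())

def meets_basel_i(capital, exposures, minimum_ratio=0.08):
    return capital / risk_weighted_assets(exposures) >= minimum_ratio

# Hypothetical balance sheet (in millions)
portfolio = {
    "cash": 100,
    "interbank": 200,
    "residential_mortgage": 400,
    "corporate_loan": 300,
}
rwa = risk_weighted_assets(portfolio)     # 0 + 40 + 200 + 300 = 540
print(rwa, meets_basel_i(45, portfolio))  # 45/540 ≈ 8.3% -> passes
```

Note how the 100 of cash adds nothing to the denominator while the corporate loans count in full; that asymmetry is the whole idea of risk weighting.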

Signs of a fragile banking system (Image Courtesy: Washington Post)
Even though Basel I was the first step towards an internationally accepted assessment of risk-weighted assets, it had a few shortcomings:
  • The categorization of credit risk was very generic, as each asset was simply assigned to one of the five broad categories (0%, 10%, 20%, 50% and 100%).
  • A static measure of 8% capital ratio did not take into account the changing nature of the default risk of financial institutions.
  • The maturity of credit exposure was not considered and duration of credit instruments was not accounted for.
  • There was no differentiation of counterparty risk for different kinds of borrowers.
  • It did not provide any relaxation for diversification of the portfolio.

Basel II

The Basel II framework, also called the Revised Capital Framework, aimed to build on the foundation laid down by Basel I. It rests on three pillars:
  1. Minimum Capital Requirements: This accord further refined the definition of risk-weighted assets and provided guidelines for calculating minimum regulatory capital ratios, dividing a bank's eligible regulatory capital into tiers.
  2. Supervisory Review: This pillar laid down guidelines for national regulatory authorities to deal with risks such as systemic risk, liquidity risk and legal risk.
  3. Market Discipline: The final pillar requires disclosures by banks regarding their risk exposures, capital adequacy and the overall risk assessment process.
Basel II was much more comprehensive in its risk definition and provided a solid framework based on the three pillars. However, even this was not perfect. Between 1998 and 2008, the volume of credit default swaps sold in the industry grew exponentially, snowballing to roughly $55 trillion, a significant proportion of which referenced below-investment-grade securities. This culminated in a complete meltdown of the financial system and the disintegration of global behemoths such as Lehman Brothers. The financial crisis of 2008 was a wake-up call for the international financial services industry and the perfect illustration of how the entire banking industry can go from boom to bust in a matter of days.
The subprime crisis in a nutshell (Image Courtesy: Santa Cruz Live)

Basel III

Basel III introduced much tighter capital requirements than Basel I and Basel II to address the weaknesses of the previous accords. One of the most evident problems with Basel II was that it did not moderate the imprudent lending activities of banking institutions.
Major changes from Basel II:
  1. Minimum Capital Requirements: Although the overall regulatory capital requirement was unaltered at 8%, the Common Equity Tier 1 capital requirement was raised from 4% to 4.5% and minimum Tier 1 capital was raised from 4% to 6%.
  2. Leverage and Liquidity: To make sure that banks have ample liquidity during financial stress and to protect them from disproportionate borrowing, a minimum leverage ratio of 3% was introduced (computed as Tier 1 capital divided by total on- and off-balance-sheet assets, less intangible assets).
  3. Countercyclical Measures: To ensure that the banks' regulatory capital was in sync with the cyclical changes in their balance sheets, new guidelines were introduced requiring banks to set aside additional capital in times of credit expansion and relaxing the capital requirements during credit contraction.
  4. Bucketing System: Basel III also established the bucketing system, in which banks are grouped into buckets according to their size, complexity and importance to the overall economy. Guidelines were defined for identifying and regularly updating a list of Systemically Important Banks and subjecting them to higher capital requirements.
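The capital thresholds in point 1 and the leverage floor in point 2 boil down to simple ratio checks. A sketch with hypothetical balance-sheet figures (the 4.5%, 6% and 3% thresholds come from the accord as described above; every other number is made up for illustration):

```python
# Basel III-style minimum-ratio checks (illustrative figures only).

def cet1_ratio(cet1_capital, rwa):
    return cet1_capital / rwa

def tier1_leverage_ratio(tier1_capital, total_exposure):
    # total_exposure: on- plus off-balance-sheet assets, net of intangibles
    return tier1_capital / total_exposure

def basel_iii_compliant(cet1_capital, tier1_capital, rwa, total_exposure):
    return (cet1_ratio(cet1_capital, rwa) >= 0.045                     # CET1 >= 4.5% of RWA
            and tier1_capital / rwa >= 0.06                            # Tier 1 >= 6% of RWA
            and tier1_leverage_ratio(tier1_capital, total_exposure) >= 0.03)  # leverage floor

# A bank with 50 of CET1 capital, 70 of Tier 1 capital, 1,000 of
# risk-weighted assets and 2,000 of total exposure passes all three tests.
print(basel_iii_compliant(50, 70, 1000, 2000))  # True
```

Dropping CET1 to 40 (4% of RWA) or blowing the exposure out to 3,000 (a 2.3% leverage ratio) would fail the check, which is exactly the bite Basel III added.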

Looking Ahead: What to expect?

Following Basel III, the banking system has been able to raise billions of dollars in regulatory capital, hire thousands of extra regulatory and compliance personnel and shed trillions of dollars of risky assets. This has resulted in a lean and efficient global banking machinery. However, although the Basel III guidelines have tackled most of the shortfalls of the previous accords, the banking environment is constantly evolving and new kinds of risks keep emerging every now and then.
Not so long ago, the banking industry fell in love with the tech industry, and together they gave birth to FinTech. This industry is largely unregulated at the moment and lacks proper supervision. In recent years, slowly but steadily, people have been moving their money from traditional brick-and-mortar banks to internet-only digital banks. With online banks, cryptocurrencies and the Internet of Things (IoT) coming into the picture, it is becoming harder and harder to decipher which firms qualify as banks and which are just tech firms. Also, with the growing role of technology in banking comes an exponentially higher amount of cyber risk.
Although it is too early to say what Basel IV would look like, it would be great to have some guidelines accounting for cyber risk and encouraging higher disclosure of reserves and other financial statistics.

The year 2016 was a year of anomalies. From Brexit to the US election results, it was a year that fully tested the limits of our wildest imagination. Amidst all this chaos, however, there was another anomaly, one noticed by far fewer people across the globe. For the first time in the history of money, the year's best-performing currency was a digital currency: Bitcoin.

The Bitcoin Symbol (Image courtesy: huffingtonpost.com)


Bitcoin and the legend of Satoshi Nakamoto

Before we talk about Bitcoin's past performance, let me briefly walk you through what the currency is all about.

What is Bitcoin?

Bitcoin is a cryptocurrency: a digital currency in which encryption techniques are used to regulate the generation of units of currency and verify the transfer of funds, operating independently of a central bank.


Bitcoin uses blockchain technology, in which transactions are verified by network nodes and recorded in a public distributed ledger. Since it is a peer-to-peer technology, transactions can take place directly between users, without intermediaries such as banks or brokers. In a nutshell, Bitcoin is a decentralised currency designed for the digital realm.

Who is Satoshi Nakamoto?

So far, nobody knows. One of the most intriguing things about Bitcoin is that its creator remains anonymous. Bitcoin was created by a person who referred to himself on the internet as 'Satoshi Nakamoto'. Nakamoto published his invention anonymously on 31 October 2008, to a mailing list of cryptographers, in a paper called "Bitcoin: A Peer-to-Peer Electronic Cash System". He never revealed any personal information, even while discussing technical details with these cryptographers. Although he claimed to be a man living in Japan, born around 1975, most observers doubt these claims, for two main reasons: firstly, Nakamoto used flawless English in all his conversations with the cryptographers; secondly, none of the software's documentation and labelling was done in Japanese.


However, everyone in the Bitcoin world keeps speculating and there are a few people who are suspected to be the real Satoshi Nakamoto (Read: Who is Satoshi Nakamoto?). Unfortunately, it is hard to point out a single person.

When did the first bitcoin transaction take place? 

The very first exchange of bitcoins for US dollars took place in the fall of 2009 between Martti Malmi (a.k.a. Sirius) and another user who called himself NewLibertyStandard. In this transaction, Sirius sent NewLibertyStandard a total of 5,050 bitcoins in exchange for $5.02, a nominal price of less than $0.001 per bitcoin. They had calculated the value of these bitcoins based on the cost of the electricity spent to generate them. Just over seven years later, on January 1, 2017, the same amount of bitcoins was worth about $5,085,350 (at $1,007 per bitcoin).
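The implied unit price, and the 2017 value of the same coins, each follow from one line of arithmetic (figures taken from the paragraph above and the 2016 price chart):

```python
# Arithmetic behind the first bitcoin-for-dollars trade.
btc = 5050
paid_usd = 5.02
unit_price = paid_usd / btc   # ≈ $0.000994 per bitcoin at the time
value_2017 = btc * 1007       # at the 01/01/2017 price of $1,007 per bitcoin
print(round(unit_price, 6), value_2017)
```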

Past Performance

The chart below shows the overall growth of the currency from 01/01/2016 to 01/01/2017, during which the price of a bitcoin more than doubled, appreciating from $423 to $1,007 in the span of just one year!


Bitcoin Price Performance in 2016 (Generated on coindesk.com)


Even though the currency appreciated significantly in 2016, the ride has been far from smooth since its release. Looking at the overall volatility of the currency, one can infer that Bitcoin is a high-risk, high-return currency. With prices very close to the previous all-time high of $1,163 attained in 2013, people have already started speculating that it might be a bubble.


Bitcoin Price Performance between 2010 and 2016 (Generated on coindesk.com)

How do I get a bitcoin?

There are three ways you can get your (virtual) hands on some bitcoins:
  1. Buy them from one of the exchanges
  2. Accept them as payment for goods and services
  3. Mine them yourself


Mining Bitcoins... What does that mean?

Traditional paper currency comes into existence when the government decides to print and distribute money. Bitcoin, however, has no central bank; bitcoins come into existence through a competitive and decentralised process called 'mining'. Miners use special software to race at a computational puzzle: repeatedly hashing candidate blocks until one of them finds a hash below a network-set target, earning a fixed number of newly issued bitcoins as the block reward. Apart from being a smart way of issuing currency, this method also incentivizes more and more people to secure the network by mining.

Bitcoin Mining (Image courtesy: coindesk.com)
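To make "solving a puzzle" concrete, here is a toy proof-of-work in Python. It is a drastic simplification: real Bitcoin mining double-hashes an 80-byte block header with SHA-256 against a far harder target, but the search loop has the same shape.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> int:
    """Toy proof-of-work: find a nonce so that SHA-256(block_data + nonce)
    starts with `difficulty` hex zeros. The only way to find one is
    brute force, which is what makes mining costly."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce
        nonce += 1

nonce = mine("some transactions", difficulty=4)
winning = hashlib.sha256(f"some transactions{nonce}".encode()).hexdigest()
print(nonce, winning)  # the winning hash begins with four hex zeros
```

Raising `difficulty` by one multiplies the expected work by 16 (one more hex digit), which is how the real network keeps the block interval steady as more mining power joins.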

But why is it called mining?

Because, like mineral mines in the real world, the resource is finite: only a set number of bitcoins can ever be created. By design, no more than 21 million bitcoins will ever exist. So far, approximately 16 million bitcoins are in existence, which means roughly 5 million are yet to be mined. However, the block reward halves roughly every four years, so the remaining bitcoins will be issued ever more slowly.
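The 21 million figure is not arbitrary: it falls out of the issuance schedule, in which the block reward starts at 50 bitcoins and halves every 210,000 blocks. Summing that series in whole satoshis (the protocol's smallest unit, with integer halving) reproduces the cap:

```python
# Total bitcoin supply implied by the halving schedule.
# Rewards are tracked in satoshis (1 BTC = 100,000,000 sats) and
# halved with integer division, mirroring the protocol's rounding.

BLOCKS_PER_HALVING = 210_000
reward_sats = 50 * 100_000_000  # initial block reward: 50 BTC

total_sats = 0
while reward_sats > 0:
    total_sats += BLOCKS_PER_HALVING * reward_sats
    reward_sats //= 2  # the halving

print(total_sats / 100_000_000)  # just under 21,000,000 BTC
```

Because of the integer rounding at each halving, the true cap is actually a shade under 21 million, but 21 million is the number everyone quotes.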

What are the advantages, disadvantages and the concerns?

Now that we know about the bitcoin, let us have a look at its good, bad and ugly aspects:

The Good
  • The future of currency: Bitcoin has serious potential to completely revolutionise payment systems and transactions.
  • No third party required: As the Bitcoin is a distributed ledger currency, it does not require a bank or a trust to act as a mediator. The money can be transferred from one peer to another directly. 
  • Low costs: If you start accepting bitcoins, you incur little to no cost in doing so.
  • Easy to set up: To set up bitcoin, all you have to do is install and run the software.
  • Blockchain: Bitcoin uses blockchain, which has been touted as the future of the banking and financial services industry. This makes its case even stronger as the currency of the future.
  • Open Source: Bitcoin is open-source software, so developers across the globe can use their knowledge to improve it. They can also build new services and software on top of Bitcoin.

The Bad
  • Volatility: As the market volatility suggests, people still do not have complete faith in Bitcoin. The past performance suggests that investing in the bitcoin is not for the faint-hearted.
  • Internet Access: Almost 60% of the world still does not have internet access. As Bitcoin requires access to the internet, its implementation is still not possible in these regions.

The Ugly
  • Robbed Exchanges: Hackers have on occasion destabilised or 'robbed' some of the bitcoin exchanges. The most famous case is that of the Mt. Gox exchange, which had to file for bankruptcy protection in February 2014 after bitcoins worth around $450 million were 'stolen' from it.
  • The Dark Web: As it is easier to transact anonymously over the internet using darknets such as Tor and the Invisible Internet Project (I2P), there have been cases where Bitcoin was used for nefarious activities such as hiring a hitman, drug peddling and purchasing guns.
  • The Silk Road: This article would be incomplete without a mention of the Silk Road, the biggest online black market for illegal drugs before federal law enforcement shut it down. In short, it was like eBay for criminals. With such critical threats, defining a clear-cut regulatory framework that keeps such activities from proliferating seems a herculean task.

Are we ready for Bitcoin?

So is it time to say goodbye to paper money and hello to digital currency? Not yet. In my opinion, it will take quite some time before such a digital currency comes into full-blown worldwide circulation. 
Yes, it is a good technology, and it is being adopted and experimented with. For instance, the Royal Canadian Mint came out with MintChip, which borrows a lot from the philosophy of Bitcoin; this is one of several such instances. However, due to the above-mentioned concerns, it will take another couple of years to make it a safe, secure and stable currency. Till then, we will have to make do with the era of paper currencies and quantitative easing.

Recommended book about the Bitcoin:

If you are interested in knowing more about the Bitcoin and how it was created, I would highly recommend reading Digital Gold by Nathaniel Popper.
If there was ever a time to be excited about the insurance and financial services industry, it is right now. With the influence of FinTech increasing by leaps and bounds across the globe (Details: How finance is being taken over by tech), its close cousin InsurTech has been witnessing a very similar transformation.

(Image Courtesy: startupbootcamp.org)

Although the metamorphosis phase for InsurTech is expected to last quite long (till somewhere around 2025 according to current forecasts), there is a consensus among the industry experts that technology is going to be the main driving force for the insurance industry of the future. So if I were to identify three main technologies which, in my opinion, will lead the industry in the right direction, what would they be?

1. Internet of Things


Internet of Things (Image courtesy: insurancenetworking.com)
Imagine a world where data about your driving history, health condition and usage of home appliances is recorded and then shared with insurance firms, so that they can not only provide you with a quote customized exactly to your lifestyle, but also reward you for maintaining a less risky way of living. That is the Internet of Things (IoT) in a nutshell.

With the advent of big data analytics, it was only natural for firms to make use of IoT for better underwriting decisions, real-time monitoring and elimination of documentation for faster processing of claims. Although I mention just two main streams of IoT which have seen a lot of disruption in the past few years, the possibilities under the IoT umbrella are limitless.


  • Telematics: In collaboration with the automotive industry, this is a huge step towards promoting road safety and safer driving habits. The collection of real-time data will enable the insurers to provide better-suited plans to their customers. The type of data recorded will include things such as hours driven, distance covered, engine diagnostics, driving behavior and terrain driven on. In the event of a loss, it would be much more convenient to determine the cause of the accident and process the claim payouts accordingly. 


  • Wearables: With the overall population becoming more and more health conscious, health monitoring gadgets such as Fitbit, Jawbone and the Apple watch have been rapidly gaining ground. As the market penetration of these devices increases, insurers are looking to use the data collected by these wearables to provide a more personalized service to each and every customer. Apart from promoting healthy living, these devices can also be used to send payment reminders and other communication.


2. Blockchain

Blockchain has to be one of the most revolutionary technologies ever in the world of finance. It completely redefines the way transactions take place. The technology is an open, distributed ledger that can record transactions between two parties efficiently and in a verifiable and permanent way. The ledger itself can also be programmed to trigger transactions automatically.


I believe that blockchain will play a crucial role in transforming and reshaping the financial services industry because of the following attributes:

  • Decentralization: Blockchain presents a totally new way of thinking about transactions. By introducing the concept of a distributed ledger, it eliminates the need for a third party (a central ledger keeper) to maintain the records. This puts power directly in the hands of the users, reduces the cost of running the system and increases the overall efficiency of the financial system.
  • Transparency: Since the ledger is verifiable by all the nodes of the blockchain environment, all transactions have to be in sync. In other words, no single node/person can manipulate or alter the data. This crowdsourced security promotes openness, thereby leading to a highly transparent system, and can remarkably reduce cases of corruption and financial frauds.
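The tamper-evidence behind both attributes comes from each block committing to the hash of the block before it. A toy hash chain in Python illustrates just that linking (no network, no consensus, nothing else of a real blockchain):

```python
import hashlib

def block_hash(prev_hash: str, data: str) -> str:
    return hashlib.sha256(f"{prev_hash}|{data}".encode()).hexdigest()

def build_chain(transactions):
    """Each block stores the previous block's hash, so altering any past
    record invalidates every block after it."""
    chain, prev = [], "0" * 64  # genesis pointer
    for tx in transactions:
        h = block_hash(prev, tx)
        chain.append({"data": tx, "prev": prev, "hash": h})
        prev = h
    return chain

def is_valid(chain):
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block["prev"], block["data"]):
            return False  # the block's own hash no longer matches its data
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False  # the link to the previous block is broken
    return True

chain = build_chain(["Alice pays Bob 5", "Bob pays Carol 2"])
print(is_valid(chain))                   # True
chain[0]["data"] = "Alice pays Bob 500"  # tamper with history
print(is_valid(chain))                   # False
```

In a real blockchain every node holds a copy and runs this kind of check, which is why no single participant can quietly rewrite the ledger.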

In fact, if we look at the investment banking industry, according to a recent analysis by Accenture and McLagan (Banking on Blockchain), Blockchain technology could save more than $8 billion a year in total for the eight leading investment banks considered in their study. I believe that we can expect similar results for the insurance industry as well.

3. AI Driven Automation



Similar to the trend in other industries, a lot of jobs in the insurance industry will soon be automated. The introduction of chatbots and automated insurance agents is just the beginning; processes such as underwriting, claims processing and financial analytics seem to be next in the pipeline. This would not only create a highly efficient framework, but also reduce the chance of human error in the insurance value chain.

Insurers are also considering the possibility of clubbing multiple technologies to provide an entire ecosystem to the customer. This will enable the insurers to push package deals to their clients and lead to a win-win situation for both the insurer and the insured. The much needed evolution of the insurance industry has just begun. Looking at the endless possibilities that these technologies present, the insurance industry of the future is already starting to look super fascinating! 

The Solvency II guidelines finally came into effect on January 1, 2016. Since then there have been quite a few articles and discussions regarding the ORSA process, its implementation and the success of Solvency II so far. So I thought I would share my two cents on what I have been able to decipher from my research on the ORSA. Here goes.

(Image Courtesy: asscompact.de)

What is Solvency II? 

Before diving into the ORSA process, it is imperative to know about the Solvency II directive. As the II in the name suggests, Solvency II was preceded by the Solvency I regime, whose roots go back to the first EU insurance directives of 1973. While Solvency I was all about a prudent valuation of liabilities and quantitative restrictions on a company's assets, Solvency II is far broader, introducing a thoroughly risk-based approach and a well-defined risk review framework. In brief, the main motive of the Solvency II guidelines is to harmonise risk management and reporting processes across the EU countries, and later on across the globe, bringing about a unification of reporting standards across all the (re)insurance firms in the EU. Solvency II mandates that (re)insurance firms hold eligible own funds at least equal to a risk-based capital threshold, the Solvency Capital Requirement (SCR), so that they are able to meet their obligations under insurance contracts even during black swan events. To ensure a high quality of capital, Solvency II rewards well-diversified insurers with lower capital requirements and penalises those with high risk accumulation, especially in a single geography or line of business.
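The diversification reward works through correlation: under the standard formula, per-risk capital charges are combined with a square-root correlation aggregation, so the total SCR comes out below the simple sum of the parts. A sketch with illustrative charges and correlations (not EIOPA's actual calibration):

```python
from math import sqrt

def aggregate_scr(charges, corr):
    """Square-root aggregation in the style of the Solvency II standard
    formula: SCR = sqrt(sum_ij corr[i][j] * charge_i * charge_j).
    The charges and correlation matrix below are illustrative only."""
    n = len(charges)
    return sqrt(sum(corr[i][j] * charges[i] * charges[j]
                    for i in range(n) for j in range(n)))

# Hypothetical market, counterparty-default and underwriting charges
charges = [100.0, 40.0, 80.0]
corr = [[1.00, 0.25, 0.25],
        [0.25, 1.00, 0.50],
        [0.25, 0.50, 1.00]]

scr = aggregate_scr(charges, corr)
print(round(scr, 1), sum(charges))  # the aggregate is well below the simple sum of 220
```

With perfect correlation (all entries 1) the formula collapses to the plain sum; the further the correlations fall below 1, the bigger the diversification credit, which is exactly the incentive the directive intends.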

Solvency II Pillars 

The Solvency II directive consists of three pillars: 
  • Pillar 1 consists of quantitative requirements 
  • Pillar 2 lays down the requirements for the governance, risk management and effective supervision of insurers 
  • Pillar 3 focuses on the disclosure and transparency requirements 

What is ORSA? 

Of the three pillars mentioned above, the ORSA falls under the second. Every (re)insurance firm must conduct its Own Risk and Solvency Assessment as part of its risk management process. This involves defining the firm's risk profile, setting limits for risk tolerance and deciding its business strategy. Using these constraints, the firm can then project its prospective solvency positions under various potential scenarios and stress conditions. The ultimate objective of this exercise is to ensure that the firm maintains a healthy level of solvency at all times, having considered all kinds of potential risk scenarios.

The Solvency II directive explains that the ORSA must include, at least, consideration of: 
  1. The undertaking’s overall solvency needs, taking into account the specific risk profile, approved risk tolerance limits and business strategy. 
  2. Continuous compliance with the Solvency II requirements for technical provisions and solvency capital. 
  3. The degree to which the undertaking’s risk profile deviates from the assumptions underlying the SCR, calculated with the standard formula or with its partial or full internal model. 
Being at the heart of the Solvency II directive, the ORSA requires a concerted effort across departments such as risk management, underwriting, actuarial, finance, investments, compliance, internal audit and IT.

In a nutshell, the ORSA is a set of processes constituting the tools for decision making and strategic analysis, designed to help the board members of insurance and reinsurance firms make sound strategic decisions, define the value created and embed risk awareness throughout the organisation.

Another clear and precise definition can be found in the ORSA Issues Paper (May 2008) by CEIOPS:

“the ORSA can be defined as the entirety of the processes and procedures employed to identify, assess, monitor, manage, and report the short and long term risks a (re)insurance undertaking faces or may face and to determine the own funds necessary to ensure that the undertaking’s overall solvency needs are met at all times.” 

An important thing to note is that the ORSA is not a one-time exercise but a continuous and evolving process through which a firm monitors and manages its risk. Responsibility for the ORSA lies with the undertaking's administrative or management body. The ORSA report should consider all material risks which can impact the undertaking's ability to meet its obligations under insurance contracts, and the ORSA should form an integral part of the management process and decision-making framework. If an internal model is used, the report must include an analysis of the differences in assumptions and outputs between the model SCR and the standard-formula SCR. Finally, the ORSA process should be appropriately evidenced, internally documented and independently assessed.

Looking Ahead 

Although Solvency II came into effect at the beginning of 2016, it is still in a transitional phase, as most firms are facing teething problems in implementing the directive. The transitional measures run all the way to January 1, 2032. In my opinion, although the implementation of the ORSA and Solvency II varies greatly from country to country, if we look at the overall progress in the EU, especially among the larger firms, the implementation has been quite successful. Some wrinkles, such as the ultimate forward rate and the use of transitional measures in solvency models, still need to be ironed out. However, looking at the benefit that firms can derive from this directive, I would say it has been well worth the effort!