
How Can We Manage Valuation Prophet Workspaces?

Finally, the time has come to prepare for actuarial valuation work for the new financial year 2013. I think it is the right time for me to share a proposal I made to one of my clients, the Actuarial Department of Company A, on how they can better manage their valuation Prophet workspaces.

Existing Approach

Apart from monthly valuation (i.e. computing statutory reserves), Company A performs various valuation exercises, such as market-consistent embedded value ("MCEV"), on a regular basis. Currently, they use a centralized Prophet valuation workspace for all sorts of valuation exercises, in which designated run numbers are assigned to different types of exercises.

In order to segregate runs for different valuation months, the Prophet workspace is duplicated every month (including the relevant tables) into a new folder. Of course, all workspaces are saved on a designated drive on the server.


Although there are some advantages to using this approach, I did share its shortcomings with Company A:

  • Under-utilization of run numbers - Each Prophet workspace allows for up to 99 runs. With this approach, many run numbers in a particular workspace may not be utilized - in simple words, many run numbers are "wasted". On the other hand, the December workspace (i.e. financial year end) may not have enough run numbers to cater for all sorts of analysis - especially those only done annually.
     
  • Housekeeping difficulties - No one likes to do housekeeping, but this is a task we need to do regularly ("Yes we hate it but we have to do it..."). Apart from increasing the housekeeping workload (due to too many workspaces), this approach also causes disputes over who should "zip" up the result files and back them up accordingly - it is not efficient to have many teams doing housekeeping on a single workspace, and it is definitely not fair to appoint an UNFORTUNATE staff member (normally junior staff are the prospective "candidates") to "zip" files & do backups.
     
  • Different needs in different valuation exercises - There are different requirements for different actuarial exercises. For example, for an annual budget workspace, we may need to create hypothetical products to project the new business for planned new products (which may require coding modifications in the library); however, for monthly valuation workspaces, it may be inappropriate to create such hypothetical products - especially if the required coding modifications impact reserves. Furthermore, monthly valuation workspaces may require different timing of coding updates - the monthly valuation team may want to do several coding changes at once, say quarterly, especially those that have minor impacts on reserves (your appointed actuary may question why you need to update your coding every month for changes with only minor impacts... Furthermore, you (as well as your boss) would need to do the testing-checking-review-documentation exercise every month until you don't have time to date your BF / GF...).

Proposed Approach

Hence, in order to overcome the above-mentioned shortcomings, I have made the following proposal to my client. In my view, a Prophet workspace, which consists of actuarial models used to produce various calculations, should be properly controlled. Apart from differentiating developer / user access rights, by right the Prophet Manager should be the ONLY ONE allowed to create a new PRODUCTION workspace, to whom the respective users should submit a request (keep the process simple - no multiple forms & sign-offs, please) when a new workspace is needed.


My proposal is:
  • Designated workspaces for each valuation team - Set up separate workspaces for different valuation teams, or even for each actuarial exercise if needed. For example, a particular valuation team responsible for both embedded value and budget exercises may want to have separate workspaces for each exercise (as the requirements are different).
     
  • Continuous use of workspaces - Utilize as many run numbers as possible in a workspace, until a new workspace is created to replace it.

    For example, we create a monthly valuation workspace and name it "p_mv13a". If no revision is needed for Jan '13-Jun '13, we can continue to use p_mv13a to perform monthly valuation runs for Jan '13-Jun '13. If a coding revision is needed during Jul '13, we create the revised workspace and name it "p_mv13b". Apart from having fewer workspaces to manage, I think you can easily see that the housekeeping work is reduced too.
     
  • When to create a new workspace? - In order to control the number of production workspaces, I would suggest the following approach when we want to introduce a new product to a workspace:

    - If the new product doesn't involve any coding modifications in the library (i.e. it only modifies the definitions of input variables) and doesn't require a change of structure in any table, I would think it is OK to create the new product in the existing workspace - instead of creating a new workspace for this purpose - as the addition of the new product doesn't affect existing products / run structures / run settings. Of course, we need to properly update the workspace version (such as from 1.0 to 1.1).

    - If the new product is created to replace an existing product (e.g. splitting an existing product into 2 products), I would think it is necessary to create a new workspace, as the existing run structures / run settings containing the replaced product are no longer workable without modification.
     
  • Use a run control form - Use a run control form to document the run activities, so that we have a proper reference in the future in case we need to use / check a specific run. Of course, such documentation is not a pleasant thing to do - since we already have a run log for each run, the run control form should be simple and kept in SOFTCOPY (so that we can duplicate an existing run control form and update it easily). A possible layout is sketched after this list.

    The above-mentioned "run activities" include the tables updated, products selected and error/warning handling (especially those errors we have ignored).
     
  • Try to make Prophet runs error free - I would recommend that we try to make each Prophet run error free - if there are no longer any in-force policies for an old product, please remove it from the run structure. If we keep ignoring errors arising from a Prophet run (most of which are missing model point files), it is quite possible that we will overlook a REAL run error - which we may only discover when we analyze the results, or NOT discover at all! Furthermore, having many errors in a Prophet run requires additional effort to check the run log.
     
  • Housekeeping & backup regularly, please - Although disk space is quite cheap nowadays, it is still a good practice to do housekeeping & backup regularly. If we "zip" up previous result files regularly, it will not only help to reduce the time required for backup - it also help us to ensure enough disk spaces for future Prophet runs (of course you don't want to find out that you have to "zip" files when you are running out of time...).

    If you store your workspaces and run results on a server, please ensure that your IT colleagues do the necessary backups regularly.
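
To make the run control form idea more concrete, here is a minimal sketch of how a SOFTCOPY form could be kept as a simple CSV file that is appended to after each run. The file name "run_control.csv", the column names and the sample entry are my own illustrative assumptions, not anything built into Prophet - adjust them to whatever your team actually records.

```python
# A minimal sketch of a soft-copy run control form kept as a CSV file.
# File name, columns and the sample entry below are illustrative assumptions.
import csv
from datetime import date
from pathlib import Path

FORM = Path("run_control.csv")  # hypothetical location - one form per workspace
COLUMNS = ["date", "workspace", "run_no", "purpose", "tables_updated",
           "products_selected", "errors_warnings_handled", "run_by"]

def log_run(**fields: str) -> None:
    """Append one run record, writing the header row on first use."""
    new_file = not FORM.exists()
    with FORM.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        if new_file:
            writer.writeheader()
        writer.writerow(fields)

# Example entry for a hypothetical monthly valuation run
log_run(date=str(date.today()), workspace="p_mv13a", run_no="12",
        purpose="Jun '13 monthly valuation",
        tables_updated="PARAMETER, GLOBAL",
        products_selected="All in-force products",
        errors_warnings_handled="None ignored",
        run_by="ABC")
```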
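
On the housekeeping point, the "zipping" of old result folders is exactly the kind of chore that can be scripted so that no UNFORTUNATE staff member has to do it by hand. Below is a rough sketch, assuming run results sit in dated sub-folders under a results directory on your server; the folder path and the 180-day cut-off are purely illustrative assumptions.

```python
# A rough housekeeping sketch: zip result sub-folders that have not been
# touched within a cut-off period, then delete the unzipped originals.
# The results path and the 180-day cut-off are illustrative assumptions.
import shutil
import time
from pathlib import Path

RESULTS_ROOT = Path(r"\\server\prophet\p_mv13a\results")  # hypothetical path
CUTOFF_DAYS = 180                                          # keep ~6 months unzipped

def archive_old_results(root: Path, cutoff_days: int) -> None:
    """Zip and remove result sub-folders not modified within the cut-off."""
    cutoff = time.time() - cutoff_days * 24 * 3600
    for folder in (p for p in root.iterdir() if p.is_dir()):
        if folder.stat().st_mtime < cutoff:
            # shutil.make_archive appends ".zip" to the base name it is given
            shutil.make_archive(str(folder), "zip", root_dir=folder)
            shutil.rmtree(folder)  # remove the original only after zipping
            print(f"Archived {folder.name} -> {folder.name}.zip")

if __name__ == "__main__":
    archive_old_results(RESULTS_ROOT, CUTOFF_DAYS)
```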
