We got another new resource the other day that I'm going to have to train and bring up to speed. The hope is that (third time's a charm?) we'll get somebody on board who can handle the normal day-to-day crap on my system, so I can focus on project work and a transition out of support. Until then, I'm too buried in the regular din to escape, short of quitting outright.

While I can say that the previous resources I worked with lacked the talent or aptitude to do what was necessary (and before you tell me I can break tasks down to the point that anybody can provide assistance, I can assure you there is a level at which the overhead and risk outweigh the benefits), I also think the method I used to train them may well be flawed. The problem is that when I want to do something, I want to understand how it works. For example, I support a risk and pricing engine for synthetic CDO deals. My job would be less interesting (yes, it's possible) and more difficult if I didn't understand what Monte Carlo simulations are. I do not think I would be as effective at support if I didn't have at least a crude understanding of finance, risk measures, credit derivatives, and structured synthetic investments.
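To give a flavor of what I mean (purely as an illustration, not a peek at the real engine), here's a toy Monte Carlo sketch in Python: it estimates the expected loss on a single CDO tranche over a homogeneous portfolio using a one-factor Gaussian copula. Every number in it is made up for the example.

```python
# Toy Monte Carlo sketch: expected loss on one synthetic CDO tranche over a
# homogeneous portfolio, using a one-factor Gaussian copula. All parameters
# (portfolio size, default probability, correlation, recovery, attachment
# points) are illustrative assumptions, not values from any real system.
import math
import random
from statistics import NormalDist

N_NAMES = 100                  # reference entities in the portfolio
DEFAULT_PROB = 0.02            # per-name default probability over the horizon
RECOVERY = 0.40                # assumed recovery rate
RHO = 0.30                     # asset correlation
ATTACH, DETACH = 0.03, 0.07    # tranche attachment / detachment points
N_PATHS = 50_000               # simulation paths

# A name defaults when its latent variable falls below this threshold.
threshold = NormalDist().inv_cdf(DEFAULT_PROB)

def tranche_loss(portfolio_loss):
    """Fraction of the [ATTACH, DETACH] tranche consumed by a given portfolio loss."""
    return min(max(portfolio_loss - ATTACH, 0.0), DETACH - ATTACH) / (DETACH - ATTACH)

total = 0.0
for _ in range(N_PATHS):
    m = random.gauss(0.0, 1.0)  # common systematic factor for this path
    defaults = sum(
        1
        for _ in range(N_NAMES)
        if math.sqrt(RHO) * m + math.sqrt(1 - RHO) * random.gauss(0.0, 1.0) < threshold
    )
    total += tranche_loss(defaults * (1 - RECOVERY) / N_NAMES)

print(f"Estimated expected tranche loss: {total / N_PATHS:.4f}")
```

The real thing is vastly more involved, of course, but even this little sketch shows why knowing what a Monte Carlo simulation is makes the support work easier to reason about.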

With this in mind, when I first train a new resource, I run through bond basics and derivative fundamentals, introduce credit derivatives, build up to structured investments, talk about synthetics, and get to the point where I can say "this is hard" and they can say "yeah, I could see that." From there, we go on to high-level instruction on how we solve the problem, and then dig into the specific details necessary for support. Ultimately, I structured the information the way I would want to process it were I coming into a new team and trying to learn the system.

The strange thing is, I learned the system by supporting it, learning and growing from the various problems that erupted. I did receive six weeks of financial training as part of the graduate recruiting program, but the sort of stuff we were playing with was a lot more sophisticated than anything we touched on in class. It took a while, but ultimately I picked the whole thing up. In a way, I would term this a sort of amortized learning approach, as I picked up what I needed to know when I needed to know it. Maybe that is the way I should approach a new hire: give them a task, say "This is broken, fix it!" and give them only enough information to shoot themselves in the foot. It's sort of like throwing a person into Provence in order to teach them French.

I'm not sure this approach is any more effective, and I suppose the ultimate answer is that each person has a particular training approach that works best for them. Faced with the problem that I really don't know new hires that well until I've worked with them for a while, I've got to make some up-front training decisions and adapt them as time goes on. I suppose then the best answer is to ask people how they learn best and cater to that … but that assumes they actually know the answer to that question. As such, I think I'm going to try to wing it this time, rather than mount the carefully coordinated onslaught. I'm not going to explain a credit event, the calculation of a bond price, or the architecture of our reporting system. I'm just going to give him a problem and hope that he starts asking the right questions. We'll see what happens.