Build RQuantLib on 32-bit Windows

Before you start, note that a Windows binary of RQuantLib is now available on CRAN.


Due to a change in how R-2.12.0 is built, the CRAN maintainers could no longer provide a Windows binary of RQuantLib with the QuantLib library they had been using. I decided to try to build an updated QuantLib library from source, which would allow me (and them) to build the current RQuantLib.
Terry August's instructions, "Getting Started with QuantLib and MinGW from Scratch" (found in QuantLib FAQ 3.2), were incredibly valuable. Thanks to Dirk Eddelbuettel for helpful guidance and pointers while I was working through this exercise, and for useful comments on this blog post.

Margin Constraints with LSPM

When optimizing leverage space portfolios, I frequently run into the issue of one or more f$ values ([Max Loss]/f) being less than the margin of their respective instruments.  For example, assume the required margin for an instrument is $500, its f$ is $100, and you have $100,000 in equity.  The optimal amount to trade is 1,000 shares ($100,000/$100).  However, at $500 of margin per share, that position would require $500,000, while you only have $100,000 in equity.  What do you do?
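
As a quick illustration, the arithmetic from the example above can be written out in a few lines of R (the numbers are the ones from the text, not output from any LSPM function):

# account equity, f$, and per-share margin from the example above
equity  <- 100000
fDollar <- 100
margin  <- 500

shares <- equity / fDollar   # optimal position: 1,000 shares
shares * margin              # margin required: $500,000
shares * margin > equity     # TRUE: the optimal position can't be funded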

Estimating Probability of Drawdown

I’ve shown several examples of how to use LSPM’s probDrawdown() function as a constraint when optimizing a leverage space portfolio.  Those posts implicitly assume the probDrawdown() function produces an accurate estimate of actual drawdown.  This post will investigate the function’s accuracy.

Calculation Notes:
To calculate the probability of drawdown, the function traverses all permutations of the events in your lsp object over the given horizon and sums the probability of each permutation that hits the drawdown constraint.  The probability of each permutation is the product of the probability of each event in the permutation.
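
The sketch below only illustrates that enumeration idea; it works on raw returns rather than the HPRs implied by an lsp object's f values, so it is not a reimplementation of probDrawdown() itself (the function name and inputs are made up for illustration):

# Enumerate every sequence of events over the horizon, flag the sequences
# whose equity curve breaches the drawdown threshold, and sum their
# probabilities (each sequence's probability is the product of its events').
probDrawdownSketch <- function(returns, probs, horizon, DD) {
  perms <- as.matrix(expand.grid(rep(list(seq_along(returns)), horizon)))
  permProb <- apply(perms, 1, function(idx) prod(probs[idx]))
  hit <- apply(perms, 1, function(idx) {
    equity <- cumprod(1 + returns[idx])
    peak <- cummax(c(1, equity))[-1]
    any(1 - equity/peak >= DD)
  })
  sum(permProb[hit])
}

# Example: a 5% loss (p=0.2), a flat day (p=0.5), a 3% gain (p=0.3);
# probability of a 4% drawdown over a 3-period horizon
probDrawdownSketch(c(-0.05, 0, 0.03), c(0.2, 0.5, 0.3), horizon=3, DD=0.04)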

LSPM Joint Probability Tables

I’ve received several requests for methods to create joint probability tables for use in LSPM’s portfolio optimization functions.  Rather than continue to email this example to individuals who ask, I’m posting it here in hopes others will find it via a Google search. ;-)

I’m certain there are more robust ways to estimate this table, but the code below is a start…

# `x` is a matrix of market system returns
# `n` is the number of bins to create for each system
# `FUN` is the function to use to calculate the value for each bin
# `...` are args to be passed to `FUN`

jointProbTable <- function(x, n=3, FUN=median, ...) {

  # Load LSPM
  if(!require(LSPM,quietly=TRUE)) stop("could not load package 'LSPM'")

  # Function to bin data
  quantize <- function(x, n, FUN=median, ...) {
    if(is.character(FUN)) FUN <- get(FUN)
    bins <- cut(x, n, labels=FALSE)
    sapply(1:NROW(x), function(i) FUN(x[bins==bins[i]], ...))
  }

  # Allow for different values of 'n' for each system in 'x'
  if(NROW(n)==1) {
    n <- rep(n, NCOL(x))
  } else if(NROW(n)!=NCOL(x)) {
    stop("invalid 'n'")
  }

  # Bin data in 'x'
  qd <- sapply(1:NCOL(x), function(i) quantize(x[,i],n=n[i],FUN=FUN,...))

  # Aggregate probabilities
  probs <- rep(1/NROW(x),NROW(x))
  res <- aggregate(probs, by=lapply(1:NCOL(qd), function(i) qd[,i]), sum)

  # Clean up output, return lsp object
  colnames(res) <- c(colnames(x), "probs")
  res <- lsp(res[,1:NCOL(x)], res[,NCOL(res)])

  return(res)
}

# Example
N <- 30
x <- rnorm(N)/100; y <- rnorm(N)/100; z <- rnorm(N)/100
zz <- cbind(x,y,z)

jpt <- jointProbTable(zz,n=c(4,3,3))
jpt
##                     x           y            z
## f         0.100000000  0.10000000  0.100000000
## Max Loss -0.009192644 -0.03090575 -0.006942066
##            probs            x            y            z
##  [1,] 0.06666667 -0.002152201 -0.030905750 -0.006942066
##  [2,] 0.06666667 -0.002152201 -0.006480683 -0.006942066
##  [3,] 0.03333333  0.024304901 -0.006480683 -0.006942066
##  [4,] 0.03333333 -0.009192644  0.001963339 -0.006942066
##  [5,] 0.06666667  0.008308007  0.001963339 -0.006942066
##  [6,] 0.03333333  0.024304901  0.001963339 -0.006942066
##  [7,] 0.03333333 -0.009192644 -0.006480683  0.001678969
##  [8,] 0.03333333  0.008308007 -0.006480683  0.001678969
##  [9,] 0.20000000 -0.009192644  0.001963339  0.001678969
## [10,] 0.06666667 -0.002152201  0.001963339  0.001678969
## [11,] 0.13333333  0.008308007  0.001963339  0.001678969
## [12,] 0.03333333  0.008308007 -0.006480683  0.013314122
## [13,] 0.03333333 -0.009192644  0.001963339  0.013314122
## [14,] 0.10000000 -0.002152201  0.001963339  0.013314122
## [15,] 0.06666667  0.008308007  0.001963339  0.013314122

Introducing IBrokers (and Jeff Ryan)

Josh kindly invited me to post on FOSS Trading around the time he first came up with the idea for the blog. Fast forward a year, and I am finally taking him up on his offer.

I’ll start by highlighting that while all the software in this post is indeed free (true to FOSS), an account with Interactive Brokers is needed to make use of it. For those not familiar with IB, they offer a trading platform that excels on numerous fronts but is most appealing to those of us who trade algorithmically. IB makes available a rather comprehensive API that allows data access and trade execution to be handled entirely programmatically via a handful of “supported” languages: Java (the language of the platform), C#, VBA, and even Excel. They also have a POSIX-compliant C++ version for those who enjoy C++ but dislike Windows.
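
To give a flavor of what this looks like from R, here is a minimal IBrokers sketch. It assumes a TWS (or IB Gateway) session is running locally on the default port, and the ticker is just an illustrative choice:

library(IBrokers)

# connect to the locally running TWS session
tws <- twsConnect()
reqCurrentTime(tws)          # simple check that the connection is alive

# request a year of daily bars for an illustrative stock contract
aapl <- reqHistoricalData(tws, Contract=twsSTK("AAPL"))

twsDisconnect(tws)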

Thoughts on LSPM from R/Finance 2010

I just got back from R/Finance 2010 in Chicago. If you couldn’t make it this year, I strongly encourage you to attend next year. I will post a more comprehensive review of the event in the next couple days, but I wanted to share some of my notes specific to LSPM.

  • How sensitive are optimal-f values to the method used to construct the joint probability table?
  • Is there an optimizer better suited for this problem (e.g. CMA-ES or adaptive differential evolution)?
  • How accurate are the estimates of the probability of drawdown, ruin, profit, etc.?
  • What could be learned from ruin theory (see the actuar package)?

These notes are mostly from many great conversations I had with other attendees, rather than thoughts I had while listening to the presentations. That is not a criticism of the presentations, but an illustration of the quality of the other conference-goers.

Historical / Future Volatility Correlation Stability

Michael Stokes, author of the MarketSci blog, recently published a thought-provoking post about the correlation between historical and future volatility (measured as the standard deviation of daily close price percentage changes). This post is intended as an extension of his “unfinished thought”, not a critique.

He suggests using his table of volatility correlations as a back-of-the-envelope approach to estimating future volatility, which led me to question the stability of the correlations in his table. His table’s values are calculated using daily data from 1970 to the present… but what if you were to calculate the correlations using only one year of data, rather than the entire history? The chart below shows the results.
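
For anyone who wants to experiment with this kind of calculation, here is a rough sketch in R. It is not the code behind Michael's table or the chart above; the symbol and window lengths are illustrative choices, and it assumes the quantmod/TTR packages:

library(quantmod)   # also loads TTR and xts

# daily close-to-close percentage changes of an illustrative index
getSymbols("^GSPC", from="1970-01-01")
ret <- ROC(Cl(GSPC), type="discrete")

n <- 21                       # ~1 month of trading days (illustrative)
histVol <- runSD(ret, n)      # trailing ("historical") volatility
futVol  <- lag(histVol, -n)   # the same measure, n days ahead ("future")

# rolling one-year (252-day) correlation of historical vs. future volatility
vols <- na.omit(merge(histVol, futVol))
rollCor <- runCor(vols[, 1], vols[, 2], n=252)
plot(rollCor, main="Historical vs. future volatility: rolling 1-year correlation")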