Sunday, December 26, 2010

R/Finance 2011 Call for Papers

The 2011 R/Finance conference has an updated call for papers.  Dirk Eddelbuettel announced it to the R-SIG-Finance mailing list.  I've reproduced his email in its entirety below.  Let me know if you plan on attending.

Subject: R/Finance 2011: Call for Papers: Now with prizes and travel money

Dear R / Finance community,

The preparations for R/Finance 2011 are progressing, and due to favourable responses from the different sponsors we contacted, we are now able to offer
  1. a competition for best paper which, given the focus of the conference, will award prizes for both an 'academic' paper and an 'industry' paper
  2. availability of travel grants for up to two graduate students, provided suitable papers are accepted for presentation
More details are below in the updated Call for Papers. Please feel free to re-circulate this Call for Papers with colleagues, students and other associations.

Cheers, and Season's Greetings,
Dirk (on behalf of the organizing / program committee)

Call for Papers:

R/Finance 2011: Applied Finance with R
April 29 and 30, 2011
Chicago, IL, USA

The third annual R/Finance conference for applied finance using R will be held this spring in Chicago, IL, USA on April 29 and 30, 2011.  The two-day conference will cover topics including portfolio management, time series analysis, advanced risk tools, high-performance computing, market microstructure and econometrics. All will be discussed within the context of using R as a primary tool for financial risk management, portfolio construction, and trading.

Complete papers or one-page abstracts (in txt or pdf format) are invited to be submitted for consideration. Academic and practitioner proposals related to R are encouraged. We welcome submissions for full talks, abbreviated "lightning talks", and for a limited number of pre-conference (longer) seminar sessions.

Presenters are strongly encouraged to provide working R code to accompany the presentation/paper.  Data sets should also be made public for the purposes of reproducibility (though we realize this may be limited due to contracts with data vendors). Preference may be given to presenters who have released R packages.

The conference will award two $1000 prizes for best paper: one for best practitioner-oriented paper and one for best academic-oriented paper.  Further, to defray costs for graduate students, two travel and expense grants of up to $500 each will be awarded to graduate students whose papers are accepted.  To be eligible, a submission must be a full paper; extended abstracts are not eligible.

Please send submissions to: committee "at"

The submission deadline is February 15th, 2011.  Early submissions may receive early acceptance and scheduling.  The graduate student grant winners will be notified by February 23rd, 2011.

Submissions will be evaluated and submitters notified via email on a rolling basis. Determination of whether a presentation will be a long presentation or a lightning talk will be made once the full list of presenters is known.

R/Finance 2009 and 2010 included attendees from around the world and featured keynote presentations from prominent academics and practitioners. Presenters' names and presentations from 2009 and 2010 are online at the conference website. We anticipate another exciting line-up for 2011--including keynote presentations from John Bollinger, Mebane Faber, Stefano Iacus, and Louis Kates.  Additional details will be announced via the conference website as they become available.

For the program committee:

       Gib Bassett, Peter Carl, Dirk Eddelbuettel, Brian Peterson,
       Dale Rosenthal, Jeffrey Ryan, Joshua Ulrich

Tuesday, December 14, 2010

Why Use R?

I use R very frequently and take for granted much that it has to offer.  I forget how R is different from similar tools, so I have trouble communicating the benefits of using R.  The goal of this post is to highlight R's main strengths, but first... my story.

How I got started with R

I was introduced to R while I was working as a Research Analyst at the Federal Reserve Bank of St. Louis.  I wanted to do statistical analysis at home but the tools I used at work (GAUSS and SAS) were expensive, so I started doing my analysis in Excel.

But as my analysis became more complex, the Excel files became large and cumbersome.  The files also did not document my thought process, which made it difficult to revisit analysis I had started several months earlier.  I asked my fellow analysts for advice and one introduced me to R and Modern Applied Statistics with S.  Thus began my auto-didactic journey with R.

Why should you use R?

R is the leading tool for statistics, data analysis, and machine learning.  It is more than a statistical package; it’s a programming language, so you can create your own objects, functions, and packages.
Speaking of packages, there are over 2,000 cutting-edge, user-contributed packages available on CRAN (not to mention Bioconductor and Omegahat).  To get an idea of what packages are out there, just take a look at these Task Views.  Many packages are submitted by prominent members of their respective fields.
Like all programs, R programs explicitly document the steps of your analysis and make it easy to reproduce and/or update analysis, which means you can quickly try many ideas and/or correct issues.
You can easily use it anywhere.  It's platform-independent, so you can use it on any operating system.  And it's free, so you can use it at any employer without having to persuade your boss to purchase a license.
Not only is R free, but it's also open-source.  That means anyone can examine the source code to see exactly what it’s doing.  This also means that you, or anyone, can fix bugs and/or add features, rather than waiting for the vendor to find/fix the bug and/or add the feature--at their discretion--in a future release.
R allows you to integrate with other languages (C/C++, Java, Python) and enables you to interact with many data sources: ODBC-compliant databases (Excel, Access) and other statistical packages (SAS, Stata, SPSS, Minitab).
Explicit parallelism is straightforward in R (see the High Performance Computing Task View): several packages allow you to take advantage of multiple cores, either on a single machine or across a network.  You can also build R with custom BLAS.
R has a large, active, and growing community of users.  The mailing lists provide access to many users and package authors who are experts in their respective fields.  Additionally, there are several R conferences every year.  The most prominent and general is useR.  Finance-related conferences include Rmetrics Workshop on Computational Finance and Financial Engineering in Meielisalp, Switzerland and R/Finance: Applied Finance with R in Chicago, USA.
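
The explicit-parallelism point above can be sketched with a socket cluster. This is a minimal sketch using the parallel package that ships with recent R (snow's makeSOCKcluster works the same way); the toy task and object names are mine:

```r
library(parallel)

# a toy CPU-bound task: bootstrap standard error of the mean
bootSE <- function(x, reps = 200)
  sd(replicate(reps, mean(sample(x, replace = TRUE))))

series <- replicate(4, rnorm(100), simplify = FALSE)

cl <- makeCluster(2)                  # two worker processes on one machine
res <- parLapply(cl, series, bootSE)  # one bootstrap per series, in parallel
stopCluster(cl)
```

The same parLapply call works unchanged when the cluster spans several machines instead of one.
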
I hope that's a helpful overview of some benefits of using R.  I'm sure I have forgotten some things, so please add them in the comments.

Tuesday, December 7, 2010

Build RQuantLib on 32-bit Windows

Before you start, note that a Windows binary of RQuantLib is now available on CRAN.

Due to a change in how R-2.12.0 is built, CRAN maintainers could no longer provide a Windows binary of RQuantLib with the QuantLib library they had been using. I decided to try to build an updated QuantLib library from source, which would allow me (and them) to build the current RQuantLib.
Instructions for Getting Started with QuantLib and MinGW from Scratch by Terry August (found in QuantLib FAQ 3.2) were incredibly valuable.  Thanks to Dirk Eddelbuettel for helpful guidance and pointers while I was working through this exercise, and for useful comments on this blog post.

Here are the steps I took.  You will need to modify the paths to suit your particular setup.
  1. Download and install Rtools.
  2. Download and install MinGW.
  3. Download boost (I used boost_1_42_0.tar.gz)
    unzip to c:/R/cpp/boost_1_42_0
    We only need the headers, so there's nothing to install.
  4. Download QuantLib (I used QuantLib-1.0.1.tar.gz)
    unzip to c:/R/cpp/QuantLib-1.0.1
  5. Install QuantLib. The make and make install commands are going to take quite some time. I think they took about 2 hours on my 3.4GHz system. Let's get started. Open an MSYS command line and run:
    set PATH=c:/MinGW/bin:$PATH
    cd c:/R/cpp
    mkdir lib include
    cd QuantLib-1.0.1
    configure --with-boost-include=c:/R/cpp/boost_1_42_0 --prefix=c:/R/cpp
    make install
    cd c:/R/cpp/lib
    cp libQuantLib.a libQuantLib.a.bak
    strip --strip-unneeded libQuantLib.a
  6. Download the RQuantLib source (I used RQuantLib_0.3.4.tar.gz)
    unzip it to c:/R/cpp/RQuantLib
  7. Open c:/R/cpp/RQuantLib/src/ and ensure
  8. Make the directories c:/R/cpp/QuantLibBuild and c:/R/cpp/QuantLibBuild/lib,
    then copy:
    c:/R/cpp/boost_1_42_0/boost to c:/R/cpp/QuantLibBuild/boost
    c:/R/cpp/include/ql to c:/R/cpp/QuantLibBuild/ql
    c:/R/cpp/lib/libQuantLib.a to c:/R/cpp/QuantLibBuild/lib/libQuantLib.a
  9. Now you should be able to build RQuantLib via:
    set QUANTLIB_ROOT=c:/R/cpp/QuantLibBuild
    R CMD INSTALL RQuantLib_0.3.4.tar.gz
I cannot guarantee these instructions will work on a 64-bit system because I do not have access to a 64-bit Windows machine, but the steps should be fairly similar.  If you run into any issues, feel free to leave a comment and I will do my best to help.

If you just want to use my build, you can install this RQuantLib_0.3.4 Windows binary.

Friday, November 12, 2010

Risk-Opportunity Analysis

I will be attending Ralph Vince's risk-opportunity analysis workshop in Tampa this weekend.  Drop me a note if you're in the area and would like to meet for coffee / drinks.

Monday, October 25, 2010

Algorithmic Trading with IBrokers

Kyle Matoba is a Finance PhD student at the UCLA Anderson School of Management.  He gave a presentation on Algorithmic Trading with R and IBrokers at a recent meeting of the Los Angeles R User Group.  The discussion of IBrokers begins near the 12-minute mark.

Saturday, August 28, 2010

Patrick Burns is blogging

Patrick Burns is the author of several helpful R resources, including A Guide for the Unwilling S User, The R Inferno, and S Poetry. He also wrote one of my favorite critiques of Microsoft Excel: Spreadsheet Addiction.

His writing is witty, entertaining, and packed full of useful bits of information.  I strongly recommend you add his blog to your list of regular reading material.

Sunday, August 1, 2010

Margin Constraints with LSPM

When optimizing leverage space portfolios, I frequently run into the issue of one or more f$ ([Max Loss]/f) being less than the margin of its respective instrument.  For example, assume the required margin for an instrument is $500, f$ is $100, and you have $100,000 in equity.  The optimal amount to trade is 1,000 shares ($100,000/$100).  However, that would require $500,000 in equity, while you only have $100,000.  What do you do?

Page 341 of The Handbook of Portfolio Mathematics outlines how to determine how many units to trade, given margin constraints.  The methodology therein suggests finding optimal f values first, then calculating the portfolio that satisfies the margin constraints but keeps the ratio of each market system to one another the same.

For those without the book, the calculation is:
L = max(f$) / sum( ( max(f$) / f$[i] ) * margin[i] )

L = percentage of "active equity" to use when dividing by each f$
margin = initial margin for each market system
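
Plugging hypothetical numbers into the formula shows how L scales positions back; the two-system values below are made up purely for illustration:

```r
# hypothetical two-system example of the margin-constraint formula
fDollar <- c(100, 250)   # f$ for each market system
margin  <- c(500, 500)   # initial margin for each market system

# L = max(f$) / sum( ( max(f$) / f$[i] ) * margin[i] )
L <- max(fDollar) / sum(max(fDollar) / fDollar * margin)

# margin-constrained units for $100,000 of equity
units <- floor(L * 100000 / fDollar)

# total margin required never exceeds available equity
sum(units * margin)
```
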

The maxUnits function included in this post uses the formula above to return the maximum number of tradable units.  In this example, we assume our margin is equal to our maximum loss (as is the case with equities).  The code below illustrates how to use the maxUnits function after optimization.

# Load the LSPM package
library(LSPM)

maxUnits <- function(lsp, margin, equity) {

  # Make sure margin and f are the same length
  NRf <- NROW(lsp$f)
  if(NROW(margin) != NRf)
    stop(paste("'margin' must have length =",NRf))

  # Calculate maximum equity percentage
  fDollar <- -lsp$maxLoss / lsp$f
  maxfDollar <- max( fDollar[is.finite(fDollar)] )

  den <- maxfDollar / fDollar * margin
  den[!is.finite(den)] <- 0

  eqPct <- min(1, maxfDollar/sum(den))

  # Maximum tradable units
  max.units <- eqPct * equity / fDollar
  max.units
}

data(port)               # Multiple strategy data
initEq <- 100000         # Initial equity
margin <- -port$maxLoss  # Margin amounts

opt <- optimalf(port)    # Optimize portfolio
port$f <- opt$f          # Assign optimal f values to lsp object

# Units to trade
fUnits <- initEq/(-port$maxLoss/port$f)  # unconstrained
mUnits <- maxUnits(port, margin, initEq) # margin-constrained

# Equity needed to trade at f values
sum(fUnits*margin)  # unconstrained
sum(mUnits*margin)  # margin-constrained

# Implied f values based on maximum units
port$f <- mUnits*-port$maxLoss/initEq
GHPR(port)  # 1.209931

Note that the effect of the maxUnits function is to lower the optimal f values to a level within the margin constraints.  The GHPR for the portfolio falls from 1.2939 without margin constraints to 1.2099 when post-optimization margin constraints are imposed.

As I investigated this method, I wondered if optimal f values would be the same if the margin constraints were included in the objective function.  I was concerned that the post-optimization decrease in f values would be sub-optimal because a different mix of f values--that also meet the margin constraints--may have a higher GHPR.

The next block of code optimizes the portfolio with margin constraints included in the objective function (this functionality is available starting in revision 43).

# Optimize portfolio with margin constraints
opt <- optimalf(port, equity=initEq, margin=margin)
port$f <- opt$f          # Assign optimal f values to lsp object

# Units to trade
fUnits <- initEq/(-port$maxLoss/port$f)  # unconstrained
mUnits <- maxUnits(port, margin, initEq) # margin-constrained

# Equity needed to trade at f values
sum(fUnits*margin)  # unconstrained
sum(mUnits*margin)  # margin-constrained

# Implied f values based on maximum units
fImp <- mUnits*-port$maxLoss/initEq

When the margin constraints are included in the objective function, fUnits and mUnits are the same, which means the implied f values are the same as the optimal f values and the required equity is less than or equal to available equity.

In addition, we see that the post-optimization method arrives at sub-optimal f values: it achieved a GHPR of 1.209931, while including margin constraints in the objective function achieved a GHPR of 1.2486.

Saturday, June 19, 2010

Estimating Probability of Drawdown

I've shown several examples of how to use LSPM's probDrawdown function as a constraint when optimizing a leverage space portfolio.  Those posts implicitly assume the probDrawdown function produces an accurate estimate of actual drawdown.  This post will investigate the function's accuracy.

Calculation Notes:
To calculate the probability of drawdown, the function traverses all permutations of the events in your lsp object over the given horizon and sums the probability of each permutation that hits the drawdown constraint.  The probability of each permutation is the product of the probability of each event in the permutation.

In the example below, there are 20 events in each lsp object and the probability of drawdown is calculated over a horizon of 10 days, yielding 20^10 permutations to traverse - for each iteration.  So don't be surprised when the code takes more than a couple minutes to run.
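
To make the traversal concrete, here is a tiny brute-force version of the same idea in base R: two events over a three-period horizon instead of 20 events over ten. The numbers are made up, and this is only a sketch of the calculation, not LSPM's implementation:

```r
# two events with holding-period returns and probabilities (hypothetical)
hpr  <- c(1.05, 0.90)
prob <- c(0.60, 0.40)
DD   <- 0.05   # 5% drawdown constraint
HR   <- 3      # horizon

# all 2^3 permutations of events over the horizon
paths <- as.matrix(expand.grid(rep(list(1:2), HR)))

# does a path touch the drawdown constraint at any point?
hitsDD <- apply(paths, 1, function(p) {
  eq   <- cumprod(hpr[p])       # equity curve along the path
  peak <- cummax(c(1, eq))[-1]  # running peak (starting equity = 1)
  any(eq / peak <= 1 - DD)
})

# probability of drawdown = sum of each hitting path's probability,
# where a path's probability is the product of its event probabilities
probDD <- sum(apply(paths[hitsDD, , drop = FALSE], 1,
                    function(p) prod(prob[p])))
```

Here every path containing the losing event breaches the 5% constraint, so probDD reduces to one minus the probability of the all-winner path. probDrawdown does the same accounting over 20^10 paths.
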

For a more detailed discussion about the calculation, see:
pp. 89-138 of The Leverage Space Trading Model, and/or
pp. 377-414 of The Handbook of Portfolio Mathematics
both by Ralph Vince.

The results below were run on daily SPY from 2008-01-01 to 2009-01-01, using 20 days of data to estimate the probability of a 5% drawdown over the next 10 days.  Results on daily QQQQ over the same period, and monthly SPX from 1950-present produced similar results.

I chose a prediction horizon of 10 periods to provide a fairly smooth actual probability of drawdown curve without making the probDrawdown calculation time too long.  Using 10 (instead of 20) days of data in the lsp object only changed results slightly.

The chart below shows that probDrawdown nearly always overestimates actual drawdown over the next 10 periods and hardly ever underestimates it.  While it's comforting that the function doesn't underestimate risk, I would prefer a less biased estimator.

Notice that the above calculation assumes each event is independently distributed.  Brian Peterson suggested a block bootstrap to attempt to preserve any dependence that may exist.  My next post will investigate if that materially improves estimates.
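
For reference, the moving-block idea can be sketched in a few lines of base R (toy series and block length of my choosing; not necessarily the implementation the next post will use):

```r
# moving-block bootstrap sketch: resample overlapping blocks of 5
# consecutive observations to preserve short-range dependence
set.seed(1)
x <- rnorm(100)            # toy return series
b <- 5                     # block length
starts <- sample(1:(length(x) - b + 1), ceiling(length(x)/b), replace = TRUE)
resampled <- unlist(lapply(starts, function(s) x[s:(s + b - 1)]))[1:length(x)]
```
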


# Load packages (quantmod for getSymbols, LSPM for probDrawdown,
# snow for the socket cluster)
library(quantmod)
library(LSPM)
library(snow)

# Pull data and calculate differences
symbol <- "SPY"
getSymbols(symbol, from="2008-01-01", to="2009-01-01")
sym <- get(symbol)
symDiff <- diff(Cl(sym))
symDiff[1] <- 0

NP <- 20    # number of periods to use in lsp object
HR <- 10    # drawdown horizon
DD <- 0.05  # drawdown level

# Initialize projected / actual drawdown objects
NR <- NROW(symDiff)
prjDD <- xts(rep(0,NR),index(symDiff))
actDD <- xts(rep(0,NR),index(symDiff))

# Socket cluster with snow to speed up probDrawdown()
cl <- makeSOCKcluster(2)

# Start loop over data
for( i in (NP+1):(NR-HR) ) {
  # Objects to hold data for the last 20 days and next 10 days
  last20 <- symDiff[(i-NP):i]
  next10 <- symDiff[(i+1):(i+HR)]
  maxLoss <- -Cl(sym)[i]

  # Portfolios to estimate drawdown and calculate actual drawdown
  prjPort <- lsp(last20, f=1, maxLoss=maxLoss)
  actPort <- lsp(next10, f=1, maxLoss=maxLoss)
  # Estimate probability of drawdown
  prjDD[i] <- probDrawdown(prjPort, DD, HR, snow=cl)
  # Calculate actual drawdown probability
  actDD[i] <- sum(HPR(actPort)/cummax(HPR(actPort)) <= (1-DD)) / HR
} # End loop over data

# Chart results

Tuesday, May 18, 2010

LSPM Joint Probability Tables

I've received several requests for methods to create joint probability tables for use in LSPM's portfolio optimization functions.  Rather than continue to email this example to individuals who ask, I post it here in hopes they find it via a Google search. ;-)

I'm certain there are more robust ways to estimate this table, but the code below is a start...

# 'x' is a matrix of market system returns
# 'n' is the number of bins to create for each system
# 'FUN' is the function used to calculate the value for each bin
# '...' are args to be passed to 'FUN'

jointProbTable <- function(x, n=3, FUN=median, ...) {

  # Load LSPM
  if(!require(LSPM,quietly=TRUE)) stop(warnings())

  # Function to bin data
  quantize <- function(x, n, FUN=median, ...) {
    if(is.character(FUN)) FUN <- get(FUN)
    bins <- cut(x, n, labels=FALSE)
    res <- sapply(1:NROW(x), function(i) FUN(x[bins==bins[i]], ...))
    res
  }

  # Allow for different values of 'n' for each system in 'x'
  if(NROW(n)==1) {
    n <- rep(n,NCOL(x))
  } else
  if(NROW(n)!=NCOL(x)) stop("invalid 'n'")

  # Bin data in 'x'
  qd <- sapply(1:NCOL(x), function(i) quantize(x[,i],n=n[i],FUN=FUN,...))

  # Aggregate probabilities
  probs <- rep(1/NROW(x),NROW(x))
  res <- aggregate(probs, by=lapply(1:NCOL(qd), function(i) qd[,i]), sum)

  # Clean up output, return lsp object
  colnames(res) <- c(colnames(x), "probs")
  res <- lsp(res[,1:NCOL(x)],res[,NCOL(res)])
  res
}

# Example
N <- 30
x <- rnorm(N)/100; y <- rnorm(N)/100; z <- rnorm(N)/100
zz <- cbind(x,y,z)
(jpt <- jointProbTable(zz,n=c(4,3,3)))

Thursday, May 13, 2010

Introducing IBrokers (and Jeff Ryan)

Josh had kindly invited me to post on FOSS Trading around the time when he first came up with the idea for the blog. Fast forward a year and I am finally taking him up on his offer.

I'll start by highlighting that while all the software in this post is indeed free (true to FOSS), an account with Interactive Brokers is needed to make use of it. For those not familiar with IB, they offer a trading platform that excels on numerous fronts but is most appealing to those of us who trade algorithmically. IB makes available a rather comprehensive API that makes data access and trade execution entirely possible programmatically via a handful of "supported" languages. These include Java (the language of the platform), C#, VBA and even Excel. They also have a POSIX-compliant C++ version for those who enjoy C++ but dislike Windows.

For those who dislike Windows and C++, the community of IB users has a few "non-official" options. These include some nice implementations in C, Python (2), Matlab, and something even more abstracted in the trading-shim. While those are all well and good, one was missing: R.

Many of you may know I am a rather large proponent of R. I have authored or coauthored quite a few packages and help to organize the R/Finance conference in Chicago each Spring. I am also a huge single-language solution kind of guy. If I could order food and surf the internet from R, all the world would be mine. But I digress...

The IBrokers package on CRAN is my contribution to the landscape. A pure R implementation of most of the API, using nothing but R and some magic. It is now possible to connect to a running TWS (aka Trader Workstation) and retrieve historical data, request market data feeds, and even place orders -- all from R.

You can get a TWS client from IB at the link above, and installing IBrokers is easy enough from R:
> install.packages("IBrokers")
Next up would be to make sure that your TWS has sockets enabled, and that you have your localhost entered as a "trusted IP".

First find the "Configure" menu in the TWS

Next we check for "Enable ActiveX and Socket Clients"

To add a "Trusted IP" click on "All API Settings..."

Okay, that was easy. Now we are back to R code. Next we need to load our freshly installed IBrokers package and connect.
> library(IBrokers)
Loading required package: xts
Loading required package: zoo
IBrokers version 0.2-7: (alpha)
Implementing API Version 9.62
This software comes with NO WARRANTY. Not intended for production use!
See ?IBrokers for details
> tws <- twsConnect()
> tws
<twsConnection,1 @ 20100513 15:11:40 CST, nextId=1288>
As you can see, there isn't too much to talk about in the code above. We make the standard R library() call to get IBrokers into our session, and then use the twsConnect function to make a connection to the TWS. There are parameters that can be passed in (such as host and connection ID), but we needn't do that here.

The result of our call is a twsConnection object. This contains a few important bits of information that are used throughout the lifetime of the object.

To wrap up this post we'll use our new connection to fetch some historical data from IB.
> aapl <- reqHistoricalData(tws, twsSTK("AAPL"))
TWS Message: 2 -1 2104 Market data farm connection is OK:usfuture
TWS Message: 2 -1 2104 Market data farm connection is OK:usfarm
waiting for TWS reply ....... done.
Some notes about the above. The first argument to most any IBrokers call is the connection object created with twsConnect. The second argument to the above request is a twsContract object. There are a variety of ways to construct this, and twsSTK is just a shortcut from IBrokers that allows for equity instruments to be specified. The object is just a list of fields that contain data IB needs to process your requests:
> twsSTK("AAPL")
List of 14
$ conId : num 0
$ symbol : chr "AAPL"
$ sectype : chr "STK"
$ exch : chr "SMART"
$ primary : chr ""
$ expiry : chr ""
$ strike : chr "0.0"
$ currency : chr "USD"
$ right : chr ""
$ local : chr ""
$ multiplier : chr ""
$ combo_legs_desc: NULL
$ comboleg : NULL
$ include_expired: chr "0"
As you may have noticed, we assigned the output of our request to a variable aapl in our workspace. Taking a look at it reveals it is an xts object of our daily bars for the last 30 calendar days.
> str(aapl)
An ‘xts’ object from 2010-04-14 to 2010-05-13 containing:
Data: num [1:22, 1:8] 245 246 249 247 248 ...
- attr(*, "dimnames")=List of 2
..$ : NULL
..$ : chr [1:8] "AAPL.Open" "AAPL.High" "AAPL.Low" "AAPL.Close" ...
Indexed by objects of class: [POSIXt,POSIXct] TZ: America/Chicago
xts Attributes:
List of 4
$ from : chr "20100413 21:35:34"
$ to : chr "20100513 21:35:34"
$ src : chr "IB"
$ updated: POSIXct[1:1], format: "2010-05-13 15:35:36.396084"
The reqHistoricalData call takes a few arguments that can specify the barSize and duration of the data that is returned. Be warned that not all combinations work, not all working combinations are applicable to all contract types, and there are strict limits on how many queries you can make in any time period. These are IB enforced limitations and often are a source of great frustration when trying to reconcile why your simple request has failed. More information regarding the details of what works and when can be found in the IBrokers documentation, as well as the more authoritative reference from IB.

Next time we'll explore the real-time data features of IBrokers, including live market data, real-time bars, and order-book data capabilities.

Sunday, April 18, 2010

Thoughts on LSPM from R/Finance 2010

I just got back from R/Finance 2010 in Chicago. If you couldn't make it this year, I strongly encourage you to attend next year. I will post a more comprehensive review of the event in the next couple days, but I wanted to share some of my notes specific to LSPM.
  • How sensitive are optimal-f values to the method used to construct the joint probability table?
  • Is there an optimizer better suited for this problem (e.g. CMA-ES, or adaptive differential evolution)?
  • How accurate are the estimates of the probability of drawdown, ruin, profit, etc.?
  • What could be learned from ruin theory (see the actuar package)?
These notes are mostly from many great conversations I had with other attendees, rather than thoughts I had while listening to the presentations. That is not a criticism of the presentations, but an illustration of the quality of the other conference-goers.

Sunday, April 11, 2010

Historical / Future Volatility Correlation Stability

Michael Stokes, author of the MarketSci blog, recently published a thought-provoking post about the correlation between historical and future volatility (measured as the standard deviation of daily close price percentage changes). This post is intended as an extension of his "unfinished thought", not a critique.

He suggests using his table of volatility correlations as a back-of-the-envelope approach to estimate future volatility, which led me to question the stability of the correlations in his table. His table's values are calculated using daily data from 1970-present... but what if you were to calculate correlations using only one year of data, rather than thirty? The chart below shows the results.

The chart shows the rolling one-year (252-day) correlations for the diagonal in Michael's table (e.g. historical and future 2-day volatility, ..., historical and future 252-day volatility). You can see the shorter periods are generally more stable, but are also closer to zero. The rolling one-year correlation between historical and future one-year volatility swings wildly between +/-1 over time.

This isn't to argue that Michael's back-of-the-envelope approach is incorrect; rather, it is an attempt to make the approach more robust by weighing long-term market characteristics against recent market behavior.

For those interested, here is the R code I used to replicate Michael's table and create the graph above. An interesting extension of this analysis would be to calculate volatility using TTR's volatility() function instead of standard deviation. I'll leave that exercise to the interested reader.


# load packages (quantmod also loads TTR, which provides ROC, runSD, runCor)
library(quantmod)

# pull SPX data from Yahoo Finance
getSymbols("^GSPC", from="1970-01-01")

# volatility horizons
GSPC$v2 <- runSD(ROC(Cl(GSPC)),2)
GSPC$v5 <- runSD(ROC(Cl(GSPC)),5)
GSPC$v10 <- runSD(ROC(Cl(GSPC)),10)
GSPC$v21 <- runSD(ROC(Cl(GSPC)),21)
GSPC$v63 <- runSD(ROC(Cl(GSPC)),63)
GSPC$v252 <- runSD(ROC(Cl(GSPC)),252)

# volatility horizon lags
GSPC$l2 <- lag(GSPC$v2,-2)
GSPC$l5 <- lag(GSPC$v5,-5)
GSPC$l10 <- lag(GSPC$v10,-10)
GSPC$l21 <- lag(GSPC$v21,-21)
GSPC$l63 <- lag(GSPC$v63,-63)
GSPC$l252 <- lag(GSPC$v252,-252)

# volatility correlation table
cor(GSPC[,c("v2","v5","v10","v21","v63","v252")],
    GSPC[,c("l2","l5","l10","l21","l63","l252")],
    use="pairwise.complete.obs")

# remove missing observations
GSPC <- na.omit(GSPC)

# rolling 1-year volatility correlations
GSPC$c2 <- runCor(GSPC$v2,GSPC$l2,252)
GSPC$c5 <- runCor(GSPC$v5,GSPC$l5,252)
GSPC$c10 <- runCor(GSPC$v10,GSPC$l10,252)
GSPC$c21 <- runCor(GSPC$v21,GSPC$l21,252)
GSPC$c63 <- runCor(GSPC$v63,GSPC$l63,252)
GSPC$c252 <- runCor(GSPC$v252,GSPC$l252,252)

# plot rolling 1-year volatility correlations
plot.zoo(GSPC[,c("c2","c5","c10","c21","c63","c252")],
  main="Rolling 252-Day Volatility Correlations")

Friday, April 9, 2010

Maximum Probability of Profit

To continue with the LSPM examples, this post shows how to optimize a Leverage Space Portfolio for the maximum probability of profit. The data and example are again taken from The Leverage Space Trading Model by Ralph Vince.

These optimizations take a very long time. 100 iterations on a 10-core Amazon EC2 cluster took 21 hours. Again, the results will not necessarily match the book because of differences between DEoptim and Ralph's genetic algorithm and because there are multiple possible paths one can take through leverage space that will achieve similar results.

The results from the EC2 run were:
iteration: 100 best member: 0.0275 0 0.0315 -0.928 -1 best value: -0.9999
The book results (on p. 173) were:
iteration: 100 best member: 0.085 0.015 0.129 -0.76 -0.992 best value: -0.9999

Specifying an initial population can give DEoptim an initial set of parameters that are within the constraint. This guarantees a starting point, but it can slow optimization if the f (and/or z) values are too low. Therefore, experiment with the initial population to find a set of f (and/or z) values that produce a result within, but not far from, the constraint.

# Load the LSPM and snow packages
library(LSPM)
library(snow)

# Multiple strategy example (data found on pp. 84-87, 169)
trades <- cbind(

probs <- c(0.076923076923,0.076923076923,0.153846153846,0.076923076923,

# Create a Leverage Space Portfolio object
port <- lsp(trades,probs)

# Number of population members
np <- 30

# Initial population
initpop <- cbind(runif(np,0,0.01),runif(np,0,0.01),runif(np,0,0.01),

# DEoptim parameters (see ?DEoptim)
DEctrl <- list(NP=np, itermax=11, refresh=1, digits=6, initial=initpop)

# Create a socket cluster with snow to use both cores
# on a dual-core processor
cl <- makeSOCKcluster(2)

# Drawdown-constrained maximum probability of profit (results on p. 173)
res <- maxProbProfit(port, 1e-6, 12, probDrawdown, 0.1,
  DD=0.2, calc.max=4, snow=cl, control=DEctrl)

Tuesday, March 30, 2010

TTR_0.20-2 on CRAN

An updated version of TTR is now on CRAN. It fixes a couple of bugs and includes a couple of handy tweaks. Here's the full contents of the CHANGES file:

TTR version 0.20-2
Changes from version 0.20-1

• Added VWAP and VWMA (thanks to Brian Peterson)
• Added v-factor generalization to DEMA (thanks to John Gavin)

• Updated volatility() to handle univariate case of calc='close' (thanks to Cedrick Johnson)
• Moved EMA, SAR, and wilderSum from .Fortran to .Call and used xts:::naCheck in lieu of TTR's NA check mechanism
• RSI up/down momentum now faster with xts (thanks to Jeff Ryan)
• If 'ratio' is specified in EMA but 'n' is missing, the traditional value of 'n' is approximated and returned as the first non-NA value (thanks to Jeff Ryan)

• Fix to stoch() when maType is a list and 'n' is not set in the list's 3rd element (thanks to Wind Me)
• Fixed fastK in stoch() when smooth != 1
• Fixed segfault caused by EMA when n > NROW(x) (thanks to Douglas Hobbs)
• test.EMA.wilder failed under R-devel (thanks to Prof Brian Ripley)

    Saturday, February 6, 2010

    Updated Tactical Asset Allocation Results

    In November, I used the strategy in Mebane Faber's Tactical Asset Allocation paper to provide an introduction to blotter. Faber has updated the strategy's results through the end of 2009. For those interested, he expands on the paper in his book, The Ivy Portfolio.

    Friday, February 5, 2010

    R/Finance 2010: Registration Open

    As posted by Dirk Eddelbuettel on R-SIG-Finance:

    R / Finance 2010: Applied Finance with R
    April 16 & 17, Chicago, IL, US

    The second annual R / Finance conference for applied finance using R, the premier free software system for statistical computation and graphics, will be held this spring in Chicago, IL, USA on Friday April 16 and Saturday April 17.

    Building on the success of the inaugural R / Finance 2009 event, this two-day conference will cover topics as diverse as portfolio theory, time-series analysis, as well as advanced risk tools, high-performance computing, and econometrics. All will be discussed within the context of using R as a primary tool for financial risk management and trading.

    Invited keynote presentations by Bernhard Pfaff, Ralph Vince, Mark Wildi and Achim Zeileis are complemented by over twenty talks (both full-length and 'lightning') selected from the submissions.  Four optional tutorials are also offered on Friday April 16.

    R / Finance 2010 is organized by a local group of R package authors and community contributors, and hosted by the International Center for Futures and Derivatives (ICFD) at the University of Illinois at Chicago.

    Conference registration is now open. Special advanced registration pricing is available, as well as discounted pricing for academic and student registrations.

    More details and registration information can be found at the website at


    For the program committee:

        Gib Bassett, Peter Carl, Dirk Eddelbuettel, John Miller,
        Brian Peterson, Dale Rosenthal, Jeffrey Ryan

    I hope to meet some of you there!

    Sunday, January 10, 2010

    LSPM with snow

    My last post provided examples of how to use the LSPM package. Those who experimented with the code have probably found that constrained optimizations with horizons > 6 have long run-times (when calc.max >= horizon).

    This post will illustrate how the snow package can increase the speed of the probDrawdown and probRuin functions on computers with multiple cores. This yields nearly linear improvements in run-times relative to the number of cores. (Improvements are nearly linear because there is overhead in setting up the cluster and communication between the nodes.)
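The snow pattern that LSPM hooks into can be sketched independently of LSPM itself (a minimal example; the squaring computation is a toy, not anything from LSPM):

```r
library(snow)

# Start two R worker processes, communicating over sockets
cl <- makeSOCKcluster(2)

# Farm a toy computation out across the workers
res <- parSapply(cl, 1:4, function(i) i^2)

# Always shut the workers down when finished
stopCluster(cl)

res  # 1 4 9 16
```

Each call that accepts a snow argument in LSPM follows this same lifecycle: create the cluster once, pass it in, and stop it when done.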

    The first optimization takes 346 seconds on my 2.2GHz Centrino, while the second optimization (with snow) takes 193 seconds... nearly a 45% improvement.

    # Load the libraries
    library(LSPM)
    library(snow)

    # Create a Leverage Space Portfolio object
    trades <- cbind(
      c(533,220.14,220.14,-500,533,220.14,799,220.14,-325,220.14,533,220.14) )
    probs <- c(rep(0.076923077,2),0.153846154,rep(0.076923077,9))
    port <- lsp(trades,probs)

    # Optimization using one CPU core
    res1 <- optimalf(port,probDrawdown,0.1,DD=0.2,horizon=5,control=list(NP=30,itermax=100))

    # Create snow socket cluster for both cores
    clust <- makeSOCKcluster(2)

    # Optimization using both CPU cores
    res2 <- optimalf(port,probDrawdown,0.1,DD=0.2,horizon=5,snow=clust,control=list(NP=30,itermax=100))

    # Stop snow cluster
    stopCluster(clust)

    Saturday, January 2, 2010

    LSPM Examples

    I have received several requests for additional LSPM documentation over the past couple of days, and a couple of months ago I promised an introduction to LSPM.

    In this long-overdue post, I will show how to optimize a Leverage Space Portfolio with the LSPM package.  Please use the comments to let me know what you would like to see next.

    A few notes before we get to the code:

    These examples are based on revision 31 from r-forge and will not work under earlier revisions (and may not work with later revisions). LSPM is still in very alpha status.  Expect things to change, perhaps significantly.

    These examples were run using code from DEoptim_1.3-3 that has been bundled inside LSPM.  We are working with the DEoptim authors to address issues with more recent versions of DEoptim.  LSPM will un-bundle and use the most recent version of DEoptim as soon as the issues are resolved.

    The first two examples are taken from Vince, Ralph (2009). The Leverage Space Trading Model. New York: John Wiley & Sons, Inc. The results will not match the book because of differences between optimization via DEoptim and Ralph's genetic algorithm implementation.  Ralph believes his genetic algorithm is getting hung up on a local maximum, whereas DEoptim is closer to the global solution.
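For readers unfamiliar with DEoptim: it searches a bounded parameter space via differential evolution, which is what drives the optimalf() calls below. Here is a toy minimization unrelated to LSPM (the objective function and bounds are invented for illustration):

```r
library(DEoptim)

# Minimize a simple quadratic bowl centered at (1, -2)
obj <- function(p) (p[1] - 1)^2 + (p[2] + 2)^2

out <- DEoptim(obj, lower = c(-5, -5), upper = c(5, 5),
               control = DEoptim.control(NP = 30, itermax = 100, trace = FALSE))

out$optim$bestmem  # should be near (1, -2)
```

NP (population size) and itermax (iteration cap) are the same knobs passed via the control list in the LSPM examples.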

      # Load the LSPM package
      library(LSPM)

      # Multiple strategy example (data found on pp. 84-87)
      trades <- cbind(
        c(533,220.14,220.14,-500,533,220.14,799,220.14,-325,220.14,533,220.14) )
      probs <- c(rep(0.076923077,2),0.153846154,rep(0.076923077,9))

      # Create a Leverage Space Portfolio object
      port <- lsp(trades,probs)

      # DEoptim parameters (see ?DEoptim)
      # NP=30        (10 * number of strategies)
      # itermax=100  (maximum number of iterations)
      DEctrl <- list(NP=30,itermax=100)

      # Unconstrained Optimal f (results on p. 87)
      res <- optimalf(port,control=DEctrl)

      # Drawdown-constrained Optimal f (results on p. 137)
      # Since horizon=12, this optimization will take about an hour
      res <- optimalf(port,probDrawdown,0.1,DD=0.2,horizon=12,calc.max=4,control=DEctrl)

      # Ruin-constrained Optimal f
      res <- optimalf(port,probRuin,0.1,DD=0.2,horizon=4,control=DEctrl)

      # Drawdown-constrained Optimal f
      res <- optimalf(port,probDrawdown,0.1,DD=0.2,horizon=4,control=DEctrl)