Quantum annealing vs. simulated annealing

Probabilistic search heuristics such as simulated annealing (SA) are very useful when the search space is extremely large, or when the search landscape is non-convex such that basic hill-climbing algorithms tend to get stuck in a local minimum.

Whilst SA works well for a variety of non-convex problems, quantum annealing can prove better for problems where the search landscape is extremely jagged, with high variance between neighbouring values. Since SA depends on the temperature of the system to determine the probability of transitioning to a worse solution, it can struggle in search spaces where the energy of neighbouring inputs varies by many orders of magnitude.

A good example of such a problem is finding the value of D whose minimal solution in X to Pell's equation, x^2 - D*y^2 = 1 (a specialised quadratic Diophantine equation with c = 1), is largest. To give a sense of the jaggedness of the energy landscape for maximising X, consider the minimal X solutions for D equal to 61 and 62: for D = 61, X = 1766319049, while for D = 62, X is only 63. This extremely large variance would prove troublesome even for SA's transition probability function. The key benefit of quantum annealing is that its transition probability function factors in the width of a hill in addition to its height. If the hill is thin enough (in this case it is), the idea is that simulated quantum fluctuations can bring the system out of a shallow local minimum.
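To make the contrast concrete, below is a minimal sketch of the two acceptance rules for a worse solution (the function names are mine; the formulas mirror the transition functions used in the implementations that follow, with width and field_strength standing in for the barrier width and the transverse field):

import numpy as np

def sa_acceptance(delta_e, temperature):
    # SA: acceptance depends only on the height of the energy barrier
    return np.exp(-delta_e / temperature)

def qa_acceptance(delta_e, width, field_strength):
    # QA heuristic: the barrier width also enters, so a tall but thin
    # barrier can still be tunnelled through
    return np.exp(-(delta_e ** 0.5) * width / field_strength)

Taking the square root of the energy gap compresses extreme differences (a gap of ~1.8 × 10^9 becomes ~4.2 × 10^4), so the barrier width and field strength, rather than raw height alone, govern the acceptance probability.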

Below is a barebones implementation in Python for finding the maximum X among minimal solutions for D ≤ 1000. Further improvements could be made to the annealing schedule, or by adapting the field-strength decay rate with respect to the number of iterations run.


import math
import time

import numpy as np

""" Energy Function """

def compute_energy(D):
    """ Uses the continued fraction expansion of sqrt(D) to find the minimal
    integer solution X of Pell's equation x^2 - D*y^2 = 1; returns -X so that
    maximising X becomes an energy minimisation problem. """
    m = 0
    d = 1
    a_0 = int(math.sqrt(D))
    a = a_0
    p_n_1 = 1
    p_n = a_0
    q_n_1 = 0
    q_n = 1
    period = 0
    # Period finding for the continued fraction representation of sqrt(D)
    while a != 2 * a_0:
        m = d * a - m
        d = (D - m * m) // d
        a = (a_0 + m) // d
        # Recurrence relation for computing convergents
        p_n_2, p_n_1 = p_n_1, p_n
        q_n_2, q_n_1 = q_n_1, q_n
        p_n = a * p_n_1 + p_n_2
        q_n = a * q_n_1 + q_n_2
        period += 1
    # If period - 1 is odd the fundamental solution is p_n_1;
    # otherwise iterate the recurrence period - 1 more times
    if (period - 1) % 2 == 1:
        return -p_n_1
    if period - 1 == 0:
        return -p_n
    for _ in range(period - 1):
        m = d * a - m
        d = (D - m * m) // d
        a = (a_0 + m) // d
        p_n_2, p_n_1 = p_n_1, p_n
        q_n_2, q_n_1 = q_n_1, q_n
        p_n = a * p_n_1 + p_n_2
        q_n = a * q_n_1 + q_n_2
    return -p_n

""" Quantum Annealing """

def quantum_annealing(start_field_strength, field_strength_decay, width, search_space, max_iter):
	start_time = time.time()
	initial_D = search_space[np.random.randint(1, len(search_space))]
	initial_energy = compute_energy(initial_D)
	iter = 0
	while iter < max_iter:
		new_D = search_space[np.random.randint(1, len(search_space))]
		new_energy = compute_energy(new_D)
		if new_energy < initial_energy:
			initial_energy = new_energy
			initial_D = new_D
			start_field_strength *= (1 - field_strength_decay)
		elif new_energy == initial_energy:
			start_field_strength = start_field_strength * (1 - field_strength_decay / 5)
		elif new_energy > initial_energy:
			if np.exp(-((new_energy - initial_energy) ** 0.5) * width / start_field_strength) > np.random.random():
				initial_energy = new_energy
				initial_D = new_D
				start_field_strength *= (1 - field_strength_decay)
			else:
				start_field_strength = start_field_strength * (1 - field_strength_decay / 5)
		iter += 1
	return initial_D, time.time() - start_time

print "Running quantum annealing "

D_list = range(2, 1001)
D_squared = [x ** 2 for x in range(2, 32)]
D_list = [x for x in D_list if x not in D_squared]

counter = 1
while counter < 250:
	print quantum_annealing(100, 0.05, 5, D_list, 1500)
	counter += 1
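As a quick sanity check on the energy function (a usage sketch), the minimal solutions quoted earlier can be recovered directly:

print(-compute_energy(61))  # 1766319049
print(-compute_energy(62))  # 63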

XOR gate

Very interesting problem from Project Euler (59), made easier by the fact that the key is lowercase and 3 letters long. The message is encrypted by cyclically repeating the 3-letter key through the message and XOR'ing each character code with the corresponding key letter. Assuming spaces have not been removed from the plaintext, a simple frequency analysis of the 3 batches of characters XOR'd with different candidate key letters decrypts the cipher text quite quickly. Implementation in Python:

import string
import time
import urllib.request

cipher = urllib.request.urlopen("https://projecteuler.net/project/cipher1.txt")
cipher_text = cipher.read().decode()
# strip() removes any trailing newline so the last token parses cleanly
cipher_text = cipher_text.strip().split(",")

""" Frequency Analysis """

key_base = string.ascii_lowercase
start_time = time.time()
space_ascii = ord(' ')
# counts[offset][i] is the number of spaces produced when the batch of
# characters at positions offset, offset+3, ... is XOR'd with key_base[i]
counts = [[], [], []]
for letter in key_base:
    for offset in range(3):
        space_counter = 0
        for i in range(offset, len(cipher_text), 3):
            if int(cipher_text[i]) ^ ord(letter) == space_ascii:
                space_counter += 1
        counts[offset].append(space_counter)

# For each key position, choose the letter that yields the most spaces
key = [ord(key_base[c.index(max(c))]) for c in counts]

""" Decryption """

decrypted_message = [int(cipher_text[i]) ^ key[i % 3] for i in range(len(cipher_text))]

ascii_sum = sum(decrypted_message)
finish_time = time.time() - start_time
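To inspect the result (a small usage sketch), the recovered key, plaintext and checksum can be printed:

print("key:", "".join(chr(k) for k in key))
print("plaintext:", "".join(chr(c) for c in decrypted_message))
print("ASCII sum:", ascii_sum)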

Energy landscape for digit sum of a^b

Thought it'd be fun to try simulated annealing on the maximum digit sum of a^b for a, b ≤ K. In the above diagram K = 100, giving a discrete search space of 10,000 values.

Though it is quite easy to brute-force search the digit sums, simulated annealing offers a more scalable solution for large K. Below is an implementation in Python:


import numpy as np

def digit_sum(number):
    s = 0
    while number:
        s += number % 10
        number //= 10
    return s

def compute_energy(base, exponent):
    # Negate the digit sum so that maximisation becomes energy minimisation
    return -(digit_sum(base ** exponent))

def simulated_annealing(start_temp, cooling_rate, max_base, max_exponent, max_iter):
    initial_base = np.random.randint(1, max_base + 1)
    initial_exponent = np.random.randint(1, max_exponent + 1)
    initial_energy = compute_energy(initial_base, initial_exponent)
    for _ in range(max_iter):
        new_base = np.random.randint(1, max_base + 1)
        new_exponent = np.random.randint(1, max_exponent + 1)
        new_energy = compute_energy(new_base, new_exponent)
        if new_energy < initial_energy:
            initial_energy = new_energy
            start_temp *= (1 - cooling_rate)
        elif new_energy == initial_energy:
            start_temp *= (1 - cooling_rate / 5)
        else:
            # Metropolis rule: accept a worse solution with probability exp(-dE/T)
            if np.exp((initial_energy - new_energy) / start_temp) > np.random.random():
                initial_energy = new_energy
                start_temp *= (1 - cooling_rate)
            else:
                start_temp *= (1 - cooling_rate / 5)
    return -initial_energy
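A usage sketch with illustrative parameters: the annealer is stochastic, so taking the best of several restarts improves the odds of hitting the global maximum for K = 100.

results = [simulated_annealing(100, 0.05, 100, 100, 1500) for _ in range(50)]
print(max(results))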

Machu Picchu

It has been four months since I first arrived in Santiago, Chile. Up until Chile, I had never been to South America. It was always a continent relegated to my imagination, which drew inspiration from the rugged landscapes of the region I had seen in National Geographic as a child. Needless to say, my time spent in Chile and in other LatAm countries has been very fruitful in drawing personal lessons.

Never Stop Learning & Adapting

I have always been fascinated by the formation and downfall of great nations, more concretely the decisions and events which contributed to their predicaments. Though certain events in history may have acted as tipping points for the relative ascendancy of one nation over another, an Epicurean view of the world doesn't do the power of human initiative and free will justice. Luck is an element, but I believe it is specific human attitudes and decisions which play a much greater role in the fate of nations.

Travelling affords one an unbiased view into the formation process of nations, and into why some thrive while others stagnate. So the biggest reward for me when I travel is drawing valuable lessons from the history of the places I visit. Chile is one of those countries which throughout its more contemporary history has been gifted with natural resource discoveries, but which in past instances has opted to squander them rather than plan for the future. It is a country littered with examples of a stable, isolated system being disrupted by an unstoppable external force.

Valparaiso

Valparaiso, a port city on the coast of Chile, is a favorite destination for me, given its relaxed atmosphere and its quaint similarities to San Francisco. But more importantly, it is a case study in the continuous growth-and-decline cycle of companies, cities and nations.

Back in the 19th century, Valparaiso gained prominence as a major stopover port for ships rounding South America via the Strait of Magellan en route to the west coast of the U.S. or Asia. The economic prosperity of trade brought about Latin America's first stock exchange and numerous other firsts in civil services. Yet the good times wouldn't last: the opening of the Panama Canal in 1914 dealt a severe blow to Valparaiso's shipping volume. What is interesting is why Valparaiso never planned for the disruption, given that the first attempt to construct the Panama Canal began in 1881. Why Valparaiso never adapted or made preparations is something I'm not prepared to answer at this stage, but the city does serve as a living reminder of the consequences when imagination fails and complacency takes over.

In some respects the influx of foreign entrepreneurs brought in by the Start-Up Chile program is at the very least a good external wake-up call for Chile. Chilean society is still highly stratified, with family name and where you live being crude gauges of your position and power in society. As a foreigner, I was very much ignorant of this undercurrent of classism, nor do I care much for it. It is my hope that I and the other foreign entrepreneurs here can catalyse the process of hacking away at this form of neo-serfdom.

The Chilean entrepreneurial scene is still very much a work in progress, and a big hindrance to its long-term success is the inherent classism that currently exists. By shifting the factors behind success and achievement in Chilean society from indeterminate, probabilistic ones (e.g. family heritage, genetic luck) to determinate ones (hard work, hustle, talent, etc.), Chile can move one step closer to building a tech hub that contributes original and innovative companies. Otherwise it is highly likely to go down the road of 1-to-n globalisation, where companies are largely clones of existing, proven ones in other parts of the world.

The World Is Truly Converging – In The Economic Sense

Vina del Mar

My first two months in South America were spent in Chile and, to be quite honest, it didn't feel like "South America". The normal amenities of Starbucks, large shopping malls and Western brands are ever present in Chile, to the extent that it becomes hard to distinguish the unique characteristics of the country.

I've had many moments in Chile where I felt I could just as easily have been in San Francisco, New York, Hong Kong or any other major international hub. Looking back at my childhood in the 90s, there was a real sense that the world was opening up, that people were embracing multiculturalism and that the world was beginning to converge. After my time in Chile, I would say that the convergence theory of globalisation holds true, but only loosely, and in the economic sense.

On one hand it is nice to have the amenities I'm used to readily available in Chile, but it is also sad to see my generation converging on the same brand of Western culture, with little differentiation apart from the native tongue they speak. Whether this is a good thing really depends on the individual, but for me it is a little sad to see us losing the distinct characteristics that define who we are and our heritage.

Up until now I haven't experienced much of Chilean culture; it is a country rushing to modernise and, in some ways, to move on quickly from its ugly recent history. The country has done well economically with its free-market policies and its large copper reserves, but whether it avoids another resource-dependent boom-and-bust cycle really hinges on how well it plans for the future.

Between May and October 1648, a series of treaties was signed in Osnabruck and Munster which ended the Thirty Years' War in the Holy Roman Empire and the Eighty Years' War between Spain and the Dutch Republic. What made the treaties of Westphalia important and notable was the establishment of the concept of territorial sovereignty.

The introduction of the sovereign state signaled the end of perennial warfare in mainland Europe, and the triumph of sovereignty over empire. This set of treaties would lay the foundations for a large part of modern international law. If one distills the fundamental change Westphalia brought to governance, one can see the coalescence of power as defined according to geographical bounds. In essence, Westphalia was a key force in demarcating the various peoples of Europe into territorial regions. It transferred governance from sovereignty over all those who share similar ethnic traits to a more rigid system bound by defined geographical boundaries. Citizenship was no longer necessarily decreed along ethnic lines, but now also in terms of the geographical location of the individual.

Pre-Westphalia: Fluid territories and national boundaries defined as a function of settlement areas

The centralized power held by various ethnic rulers and emperors over all subjects of the same ethnic group was now controlled and tightly bound by pre-determined geographical bounds. Thus, Westphalia brought about a relative decentralization of the power held by rulers. As a result, individuals were free to determine the system of power and control they were subjected to, because mobility and emigration were made possible under the sovereign state framework.

Fast forward to today, and the right to self-determination of the laws one is subjected to is all around us. Globalization and modern multiracial societies (the United States, the United Kingdom) arguably trace their roots back to this legal framework. In paving the way for greater human mobility, Westphalia brought about a massive decentralization of political power, shifting it from traditional high priests and hereditary rulers to the general population.

Post-Westphalia: Concept of sovereign state established, fixed boundaries. More conducive for immigration

A clear analogue to the decentralization of power brought about by the Peace of Westphalia is the internet today. Traditional industries such as media, which had held a monopoly on information, are being disrupted one by one by the internet's power to connect and leverage the collective resources of a vast number of individuals. Think for a moment about the way the news industry has changed. Twenty years ago, news was very much top-down. Breaking events were covered by news outlets, edited and then delivered to individual consumers. The power to collect and disseminate information lay in the hands of a small group of individuals, who, should they have chosen to, had the ability to censor and manipulate given the lack of competition. Today, the advent of social platforms like Twitter and Facebook has given everyday users a channel to voice their opinions and their own observations of the world around them. The proliferation of mobile devices coupled with software and internet solutions has enabled many more individuals to publish and share their stories. By harnessing the collective power of localized reporters, the current internet structure has created far more competition for the traditional news agencies.

The vast wealth of information stored online and the ability to connect with individuals across the world are hugely empowering. A couple of decades ago, formal education was largely the domain of governments and private universities, but the rise of MOOCs today threatens this traditional, top-down, one-size-fits-all structure of instruction. Whereas previous generations had a much more localized social experience, today we are able to connect with and befriend individuals from across the world. It is also a safe bet that many of us know someone halfway across the world better than we know our neighbors.

It has been more than three centuries since the Peace of Westphalia, and the internet today draws many parallels to the ways in which the peace treaty shaped civil liberties and individual freedoms. The internet has chipped away, and will continue to chip away, at the hegemony of traditional power structures in many areas, from media to education.

At the core of the internet is its ability to decentralize power and offer tailored experiences to the individual. As more of the world comes online, the internet's ability to connect people and share ideas offers an exciting lens through which to re-evaluate the current legal framework that governs us. What does the ability to define identity and create groups of association via the web mean for definitions of citizenship? Could our current legal framework simply be a model for interfacing with the physical world, whilst a parallel framework develops on the internet? Or could the current legal framework be completely revamped?

2^8888

Sometimes one takes on problems not for their practical purposes, but just for the sake of solving them. Since JavaScript can only represent integers exactly up to 2^53, I wanted to compute the actual digits of two raised to a very large exponent.

One effective means of doing so is to compute 2^n digit by digit, storing the resulting decimal digits in a list. The JavaScript below does this:

var log = console.log;

// Sum the elements of an array (used for the digit sum)
var sum = function(as) {
        var cc = 0;
        for(var ii = 0; ii < as.length; ii++) {
                cc += as[ii];
        }
        return cc;
};

// Compute base^k as an array of decimal digits, least significant first,
// so that no single number ever exceeds JavaScript's safe integer range
var power = function(base, k) {
        var output = [base];
        for(var init = 1; init < k; init++) {
                var carry = 0;
                for(var ii = 0; ii < output.length; ii++) {
                        var product = output[ii] * base + carry;
                        output[ii] = product % 10;
                        carry = Math.floor(product / 10);
                }
                // Append any remaining carry as new high-order digits
                while(carry > 0) {
                        output.push(carry % 10);
                        carry = Math.floor(carry / 10);
                }
        }
        return output;
};

// Join the digit array into a decimal string, most significant digit first
var agg = function(as) {
        var output = '';
        for(var ii = as.length - 1; ii >= 0; ii--) {
                output += as[ii].toString();
        }
        return output;
};

Using this script, agg(power(2, 8888)) produces the full decimal expansion of 2^8888 (output in image), and sum(power(2, 8888)) gives its digit sum of 12055.

Collatz sequence stopping time for first 1,000 integers

The Collatz sequence is defined by the following rule:

n = n/2 if n is even
n = 3n + 1 if n is odd

Based on this rule and starting with 20, the following sequence is generated:

20 – 10 – 5 – 16 – 8 – 4 – 2 – 1


var math = require('mathjs');
var log = console.log;

// Build the list of starting integers [1, 2, ..., n]
var integer = function(n) {
        var output = [];
        for(var ii = 1; ii <= n; ii++) {
                output.push(ii);
        }
        return output;
};

// Length of the Collatz sequence starting at k, e.g. collatz(20) === 8
var collatz = function(k) {
        var output = [k];
        var n = k;
        while(n > 1) {
                if(n % 2 === 0) {
                        n /= 2;
                } else {
                        n = 3*n + 1;
                }
                output.push(n);
        }
        return output.length;
};

// Collatz sequence lengths for every starting number up to j
var colsearch = function(j) {
        var intlist = integer(j);
        var output = [];
        for(var ii = 0; ii < intlist.length; ii++) {
                output.push(collatz(intlist[ii]));
        }
        return output;
};

// Starting numbers whose sequences are the longest,
// e.g. isearch(colsearch(1000000)) -> [ 837799 ]
var isearch = function(as) {
        var output = [];
        var max = math.max(as);
        for(var ii = 0; ii < as.length; ii++) {
                if(as[ii] === max) {
                        output.push(ii + 1);
                }
        }
        return output;
};

Using this code, the starting number below one million which produces the longest Collatz sequence is 837799.

I recently got hold of the first 50 million prime numbers and decided to play around with this dataset. Yitang Zhang's recent work on the distribution of primes piqued my interest in visualising the consecutive differences between primes.

I decided to use a random walk procedure where the initial direction is randomly chosen from the unit vectors {left, right, up, down}. Each subsequent step direction is randomly chosen from the two directions orthogonal to the previous one, and the magnitude of each step is given by the prime difference at that step.

Below are some figures for different numbers of prime differences:

2D random walk for (1 million – 1) prime differences

2D random walk for (20 million – 1) prime differences

2D random walk for (50 million – 1) prime differences

Here’s the R code for the random walk procedure:

# primeVec holds the primes loaded from the dataset;
# prime_diff is the vector of consecutive prime differences
prime_diff = diff(primeVec)

# Columns are the four unit vectors: up, down, right, left
step = matrix(c(0,1,0,-1,1,0,-1,0), nrow=2, ncol=4)

x_pos = rep(0, length(primeVec))
y_pos = rep(0, length(primeVec))
move_initial = step[, sample(c(1,2,3,4), size=1)]
for(i in 2:length(primeVec)){
	# Pick one of the two directions orthogonal to the previous move
	index = which(move_initial %*% step == 0)
	move_new = step[, sample(index, size=1)]
	# Scale the unit step by the prime difference
	dx = move_new[1] * prime_diff[i-1]
	dy = move_new[2] * prime_diff[i-1]
	x_pos[i] = x_pos[i-1] + dx
	y_pos[i] = y_pos[i-1] + dy
	move_initial = move_new
}
time = Sys.time()
plot(x_pos, y_pos, type='l')
legend("topright", col=1, lty=1, legend=time, bty="n", cex=0.8)

The figures above are all quite distinct, and the distinctiveness increases as the number of steps taken increases. I've decided to time-stamp each of the three figures, as it is highly unlikely that these exact figures will appear again. More concretely, the probability of each individual sequence occurring is 0.25 × 0.5^(n−1), where n is the number of prime differences.

At a higher level, the highly stochastic nature of these random walks draws some interesting parallels with the indeterminate nature of human life. The everyday and mundane moments (the work routine, observations on the way to get coffee) that we take for granted may not be so mundane when you consider that they may never appear in that exact form again in your lifetime.


Fragility

The economic history of the last century has been marked by sinusoidal swings between government fine-tuning of the economy and free markets. The triumph of war-time planning brought Keynesian economics onto the world stage. Today we see a re-emergence of Keynesian ideas, though with markedly different outcomes.

The trusted tools of conventional monetary policy that have been used for over 70 years have seemingly had very little effect. Lowering interest rates decreases the cost of borrowing for agents in the economy and in practice does stimulate growth; however, such policies may have limited effects for highly leveraged countries. A 2009 paper by Reinhart and Rogoff found that, empirically, countries with a debt-to-GDP ratio greater than 90% face a 1% lower median growth rate. Clearly the circumstances under which debt is accumulated are critical to understanding the effectiveness of fiscal and monetary policy.

Central bank balance sheet as a percentage of GDP

War-time debts are less problematic given the transfer of organized labour back to the civilian economy and the cessation of war spending once hostilities cease; excessive debt accumulated in peace time, by contrast, can be harmful to long-term growth prospects. The Federal Reserve, Bank of England, ECB and Bank of Japan have significantly increased their balance sheets since the summer of 2007. Whilst a large amount of the initial balance sheet increase was attributable to direct investments in companies and the purchase of toxic assets, the recent growth of the balance sheets stems primarily from unconventional monetary policies devised to work around the zero bound on nominal interest rates. Credit easing, quantitative easing and signaling are the new tools central bankers are using to bring life back to the global economy. In theory, the outright purchase of long-dated securities to flatten the yield curve should encourage savers to invest their cash in the economy.

To date, it is arguable how effective these policies have been. Whilst major global equity indices have rallied back to their pre-crisis levels, there is a disconnect between financial asset performance and the real economy. Unemployment in the U.S. has stagnated just below the 8% mark (see the chart below), while the situation is far worse in the Eurozone. This disconnect could be indicative of either an investor preference for allocating cash to financial markets over traditional job-generating investments in hard assets, or the limited effects of monetary policy amid a series of debt-related crises around the world. I believe it is a mix of both of these factors which has stunted the effects of expansionary monetary policy. Central bank balance sheets and national debt levels have grown to such an extent that they pose very credible long-term dangers to the global economy.

Major global indices have recovered; different story with unemployment

In an environment where asset prices have been inflated by record-low interest rates, very few asset classes remain undervalued, and the notion of being long an undervalued asset will become increasingly difficult to justify. If anything, advanced-economy leverage poses further risk to inflated asset prices. In this environment, it is much easier to look for the low-hanging fruit that runs contrary to recent market trends. A very attractive asset class which offers asymmetric payoffs is credit protection, and more specifically, Japanese credit default swaps (CDS).

Japan has for years accumulated significant debts in an attempt to stimulate its economy through both fiscal and monetary policy. From 1990 to fiscal 2013, outstanding government debt grew from 166 trillion to 750 trillion yen (excluding FILP bonds), representing annual growth of approximately 7%.

Outstanding JGB and interest payments as a percentage of government tax revenue

Meanwhile, government tax revenue has fallen sharply from its 1990 peak of 60.1 trillion to 42.6 trillion yen as of 2012. Whilst interest payments remained largely stable in the 1990s and dropped during the first decade of the new millennium, this was largely due to successive lowering of official cash rates, which allowed the Japanese government to refinance its debt. As of January 31st 2013, the 10-year Japanese Government Bond (JGB) yield stood at 76.3 basis points; shorter-tenor bonds such as the five-year JGB yielded 15.2 basis points.

JGB 10Y spot yield

While monetary policy has allowed Japan to refinance its government debt at historically low rates, its interest-to-tax ratio has slowly increased over the last two years. When more than 20% of expected government tax revenue is to be spent on servicing debt, there should be serious concerns over the sustainability of adding more. When a country has reached such a high level of debt, the answer is not to pile on more debt, but either to grow its way out or to undertake a sovereign restructuring. Unfortunately for Japan, it also faces an uphill battle in terms of its ability to grow.

Japan faces a rapidly aging population as the baby-boomer generation goes into retirement. Population growth has steadily declined since the mid-1970s and is today barely above 0% per annum. What this implies for the economy is a growing amount of pension obligations and of laborers exiting the workforce, with a much smaller cohort of young workers coming in to fill the gaps. Add to this a racially homogeneous country which, despite increasing average annual immigration in the last decade, saw net migration of less than half a million between 2005 and 2010. The demographic trends for Japan highlight an obvious issue: mounting pension obligations and a lack of young workers to grow the economy.

Proportion of population over 65 (LHS) and net population growth rate (RHS)

Immigration (LHS) and net immigration (RHS)

On top of all the fundamental issues that still plague the Japanese economy, the latest move by the Bank of Japan to adopt a 2% inflation target, as well as a commitment to monthly purchases of 13 trillion yen of financial assets, poses additional dangers to an already fragile economy. The reasons are the signaling effect that such a measure should, in theory, have on the bond market, and the further growth of the Bank of Japan's balance sheet.

Since 2009, Japan has experienced average annual deflation, which has given JGB holders a higher real interest rate. The issue with raising inflation expectations to 2% is that, in theory, bondholders should demand a higher cash coupon on new JGB issuances to compensate for the inflation. While this phenomenon has yet to occur in the JGB market, the BOJ's announcement has had significant effects in devaluing the yen.

A further look into the ownership structure of outstanding JGBs reveals some technical factors which may have hindered a rise in JGB yields, but which also point to additional concerns. As of Q3 2012, foreign ownership of JGBs stood at only 9.1%, whilst domestic banks, life insurance companies and pension funds owned more than 70% of the outstanding stock. The huge domestic ownership has allowed the Japanese government to continue monetizing its deficit without too much concern for capital flight. This, however, has created a very dangerous feedback loop encouraging the government to continue piling on debt.

Outstanding JGB holdings

Should the Abe government attempt to inflate away its debt obligations, it will run into serious issues with future deficit financing. While the average maturity of newly issued JGBs has steadily increased in the last three years, the Japanese government runs the risk of shortening average JGB issuance maturity due to the inflation risk posed by its policies. Moreover, the delicate balance held together by domestic JGB holders will not last forever: domestic pension and insurance companies will run into financing issues when the nominal income from their JGB holdings cannot meet inflated pension and claims payouts.

Average JGB issuance maturity

It is perhaps surprising that Japanese five-year CDS spreads have in fact tightened since February 2012. Assuming a 60% default recovery rate, Deutsche Bank data suggests an implied annual default probability of 1.7% as of February 6th 2013, significantly lower than the same period a year earlier despite continued debt accumulation and low growth prospects.
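As a rough illustration of where such a figure comes from (a back-of-the-envelope sketch using the standard credit-triangle approximation, not Deutsche Bank's actual model; the 68 basis point spread is a hypothetical input chosen to reproduce the quoted probability):

# Credit triangle: spread ≈ default intensity × (1 − recovery)
recovery = 0.60   # assumed default recovery rate
spread = 0.0068   # hypothetical 5Y CDS spread of 68 bps
implied_default = spread / (1 - recovery)
print(implied_default)  # 0.017, i.e. a 1.7% implied annual default probability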

JGB 5Y CDS spread (LHS) & implied default (RHS)

In this era of unprecedented central bank easing, asset inflation has caused a disconnect between financial markets and the real economy. Less talked about are the dangerous levels of debt being accumulated by the major advanced economies. Japan is an example of the limitations of Keynesian expansionary policies, and Japanese CDS represent a greatly undervalued instrument given the country's leverage and growth outlook.

San Francisco mapped through geotagging

Internet Through The Ages

The launch of the Mosaic browser by Marc Andreessen in 1993 paved the way for Netscape, which announced its browser in late 1994. The World Wide Web would fundamentally change the way global information was structured and accessed. Credit Suisse saw early on the benefits of the internet in changing the way retail clients bought and sold shares. The incorporation of Donaldson, Lufkin & Jenrette's online brokerage business DLJDirect was a sign of the ways in which traditional financial brokerage functions were going online and becoming electronic.

The fundamental difference between Web 1.0 and today's Web 2.0 era is the establishment of social media platforms that allow for user-generated content and greater online social interaction. The internet today is not just a research tool for accessing information, but has also become a place for the dissemination of user-generated content. The biggest change Web 2.0 has brought is thus a decentralization of the ways in which information is generated and shared. This distributed internet architecture has led to an explosion of user data, which holds immense opportunities to revolutionize core businesses across various parts of the financial services industry.

Data Explosion

Today's social media platforms have benefitted from network effects derived from two structural changes in internet usage. Penetration of high-speed internet in developed and developing countries has increased exponentially in the last decade, and this trend has been complemented by an increased willingness to put authentic personal information online. According to the Cisco VNI index, monthly internet traffic will expand fourfold globally from 2010 to 2015, while the number of smartphones globally could well hit five billion by 2015 according to Marc Andreessen. Growth of the internet from the first internet bubble to today has been immense and remains strong going into the remainder of this decade.

The result of these two structural shifts in internet usage, combined with smartphone penetration, has been an explosion of personal data. Social media sites like Facebook or Twitter have been immensely powerful in convincing people to willingly hand over their personal information, friendship associations and interests, as well as visual identifications of themselves and their friends. This treasure trove of data is immensely valuable.

As heavier regulation comes down on riskier securities businesses, core commercial lending businesses will take on more importance in revenue contribution for universal and commercial banks. Better Bayesian prior updates have the potential to transform traditionally rigid commercial banking services into customized services for more nuanced customer segments. Hunting for signals of particular customer needs can now be done through large-scale data mining of social media portals. For instance, a client tweets a photo of an apartment she has inspected for purchase to ask for feedback; once picked up, this information can be used for targeted marketing of home loan services to that particular client.

The realization that social media is an intelligence platform for financial institutions is crucial. Better intelligence gathering leads to more robust competitive strategies as well as quicker product feedback. Setting up social media profiles is one step, but machine learning and data science specialists must be employed to fully capture the richness of personal information and opinions disseminated through social media.

Data Driven Decision Making

As proprietary trading is clamped down on at universal and investment banks, flow trading for clients will become increasingly important. The distributed architecture of the internet means that many more people can become citizen journalists reporting on events around them. The sheer size of the global social graph means that this information can reach a wide variety of people and companies, including trading desks.

The ability of everyday citizens to disseminate information has been very disruptive to traditional media, which have relied on a limited number of journalists to cover a wide range of events. Oftentimes, natural disasters or unexpected events are reported more quickly by direct witnesses through social media platforms like Facebook or Twitter. A notable example was the Egyptian revolution of 2011, where citizens were able to paint a clearer picture of the unrest through localized social media reporting. This information, when aggregated, can be of immense value to flow trading desks. For instance, a hypothetical confrontation in the Strait of Hormuz reported by eyewitnesses on Twitter could hit the internet quicker than the reporting from various news agencies. Given the electronic execution of today's trading desks, an integrated social media listening tool could be used to gain an information advantage and thus execute trades for clients on more favorable market terms.

Similarly, such an information advantage could be exploited to generate trading signals for investment managers. The concept of using the immense amount of information available is not unfamiliar to investment banks; for instance, UBS has used satellite images of Wal-Mart store parking lots to get a more accurate estimate of quarterly earnings. An obvious extension of such intelligence-gathering techniques is to the realm of social media. A string of time-stamped Twitter updates on a new product launch could be aggregated to gauge the perception of the product amongst a diverse group of people, and this information could then be used to decide on the position to take in the given company. This is especially relevant given that more people are covering events via Twitter. Social media thus provides, in some cases, an information advantage, as well as interesting signals that could be utilized by sell-side flow trading desks and by independent proprietary desks at hedge funds and asset managers.

Crowdfunding – The Future?

For everyday citizens, the capital-raising process is esoteric, relegated to the ranks of financial services firms and institutional investors. The distributed internet architecture has made the prospect of crowdfunding large private or government projects a possibility.

As developed cities face increasing financial strain to repair or build new infrastructure, crowdfunding could be the answer. Rather than raising taxes, small infrastructure repairs or upgrades could be hosted on a crowdfunding site owned by a securities firm. This provides inherently more clarity about the end use of a constituent's capital contribution than paying taxes to the city or state government. A hypothetical scenario would be the ability for citizens to upload images of derelict sites in their neighborhood to a crowdfunding platform. Citizens who share an interest in the infrastructure overhaul could contribute online to such a project, with the pooled funds overseen by a securities firm which then transfers the crowdfunded capital to the city council to implement the necessary repairs. This is one example of possible crowdfunding in the public finance space; there may be other very interesting applications. The key takeaway is that as these low-hanging infrastructure projects and repairs move to crowdfunding platforms, city planners can focus their tighter budgets on allocating capital to bigger projects.

In the U.S., the JOBS Act is a bill that could potentially open up even large-scale statewide projects to citizen investors, who will be allowed to invest up to 10,000 USD online. Though initially aimed at startup funding, it is conceivable that infrastructure projects could soon be partially funded through such a platform.

Though crowdfunding sites like Kickstarter or Indiegogo exist, the backing of a global financial institution adds credibility. The entrance of financial services firms into crowdfunding would complement existing donation-based sites and provide their institutional clients a new source of potential investors.