Quantum annealing vs. simulated annealing


Probabilistic search heuristics such as simulated annealing (SA) are very useful when the search space is extremely large or when the search landscape is non-convex, such that basic hill-climbing algorithms tend to get stuck in a local minimum.

Whilst SA works well for a variety of non-convex problems, quantum annealing can prove to be better for problems where the search landscape is extremely jagged, with high variance of values. Since SA depends on the temperature of the system to determine the probability of transitioning to a worse solution, it can struggle in search spaces where there is extreme variance in energy between neighboring inputs.

A good example of such a problem is finding the value of D for which the minimal solution in X of Pell’s equation x^2 − D·y^2 = 1 (a specialised quadratic Diophantine equation) is largest. To give an idea of the jaggedness of the energy landscape for maximising X, consider the minimal X solutions for D equal to 61 and 62: for D = 61, X = 1766319049, while for D = 62, X is only 63. This extremely large variance would prove troublesome even for SA’s transition probability function. The key benefit of quantum annealing is that its transition probability also factors in the width of a hill in addition to its height. If the hill is thin enough (in this case it is), the idea is that simulated quantum fluctuations can bring the system out of a shallow local minimum.
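To make the contrast concrete, here is a minimal sketch (not the full implementation that follows) of the two acceptance rules for an uphill move of height ΔE: classical SA uses only the height and the temperature T, while the quantum-annealing-style rule used below also uses the barrier width w and the transverse field strength Γ.

import numpy as np

def sa_accept_prob(delta_e, temperature):
	# Classical Metropolis rule: depends only on the barrier height and temperature
	return np.exp(-delta_e / temperature)

def qa_accept_prob(delta_e, width, field_strength):
	# Quantum-annealing-style rule: a tall but narrow barrier can still be
	# "tunnelled" through while the transverse field is strong
	return np.exp(-np.sqrt(delta_e) * width / field_strength)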

Below is a barebones implementation in Python for finding the maximum X in minimal solutions for D ≤ 1000. Further improvements could be made to the annealing schedule, or by adapting the field-strength decay rate to the number of iterations run.


import math
import numpy as np
import time

""" Energy Function """

def compute_energy(D):
	""" Uses the continued fraction expansion of sqrt(D) to find the minimal X solving Pell's equation """
	m = 0
	d = 1
	a_0 = math.isqrt(D)
	a = a_0
	p_n_1 = 1
	p_n = a_0
	q_n_1 = 0
	q_n = 1
	period = 0
	# Period finding for the continued fraction representation of sqrt(D)
	while a != 2 * a_0:
		m = d * a - m
		d = (D - m * m) // d
		a = (a_0 + m) // d
		# Recurrence relation for computing convergents
		p_n_2, p_n_1 = p_n_1, p_n
		q_n_2, q_n_1 = q_n_1, q_n
		p_n = a * p_n_1 + p_n_2
		q_n = a * q_n_1 + q_n_2
		period += 1
	# If the period is even, the previous convergent gives the minimal solution;
	# if it is odd, extend the expansion by period - 1 further terms
	if period % 2 == 0:
		return -p_n_1
	elif period == 1:
		return -p_n
	else:
		for _ in range(period - 1):
			m = d * a - m
			d = (D - m * m) // d
			a = (a_0 + m) // d
			p_n_2, p_n_1 = p_n_1, p_n
			q_n_2, q_n_1 = q_n_1, q_n
			p_n = a * p_n_1 + p_n_2
			q_n = a * q_n_1 + q_n_2
		return -p_n
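
# Optional sanity check: the minimal solution quoted above for D = 61 implies
# compute_energy(61) should return -1766319049.
# print(compute_energy(61))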

""" Quantum Annealing """

def quantum_annealing(start_field_strength, field_strength_decay, width, search_space, max_iter):
	start_time = time.time()
	current_D = search_space[np.random.randint(len(search_space))]
	current_energy = compute_energy(current_D)
	iteration = 0
	while iteration < max_iter:
		new_D = search_space[np.random.randint(len(search_space))]
		new_energy = compute_energy(new_D)
		if new_energy < current_energy:
			# Better solution: always accept and decay the transverse field
			current_energy = new_energy
			current_D = new_D
			start_field_strength *= (1 - field_strength_decay)
		elif new_energy == current_energy:
			start_field_strength *= (1 - field_strength_decay / 5)
		else:
			# Worse solution: acceptance probability depends on both the height
			# and the width of the barrier relative to the field strength
			if np.exp(-((new_energy - current_energy) ** 0.5) * width / start_field_strength) > np.random.random():
				current_energy = new_energy
				current_D = new_D
				start_field_strength *= (1 - field_strength_decay)
			else:
				start_field_strength *= (1 - field_strength_decay / 5)
		iteration += 1
	return current_D, time.time() - start_time

print "Running quantum annealing "

D_list = range(2, 1001)
D_squared = [x ** 2 for x in range(2, 32)]
D_list = [x for x in D_list if x not in D_squared]

counter = 1
while counter < 250:
	print quantum_annealing(100, 0.05, 5, D_list, 1500)
	counter += 1
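Since each call above is an independent restart, one simple way to aggregate the runs into a single answer is to keep the D whose minimal solution has the largest X, i.e. the lowest energy. A sketch, replacing the print loop above:

results = [quantum_annealing(100, 0.05, 5, D_list, 1500)[0] for _ in range(250)]
best_D = min(results, key=compute_energy)
print(best_D, -compute_energy(best_D))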

XOR gate


Very interesting problem from Project Euler (Problem 59), made easier by the fact that the key is lowercase and three letters long. The message is encrypted by XORing each character with the three-letter key, repeated cyclically through the text. Assuming spaces have not been removed from the plaintext, a simple frequency analysis of the three batches of characters, each XOR’d with every candidate key letter, decrypts the ciphertext quite quickly. Implementation in Python:

import string
import time
import urllib.request

cipher = urllib.request.urlopen("https://projecteuler.net/project/cipher1.txt")
cipher_text = cipher.read().decode("ascii").strip().split(",")

""" Frequency Analysis """

key_base = string.ascii_lowercase
start_time = time.time()
letter_1 = []
letter_2 = []
letter_3 = []
space_ascii = ord(' ')
for letter in key_base:
	# Count how often each candidate key letter turns a character into a space,
	# separately for each of the three positions in the repeating key
	space_counter = 0
	for i in range(0, len(cipher_text), 3):
		if int(cipher_text[i]) ^ ord(letter) == space_ascii:
			space_counter += 1
	letter_1.append(space_counter)
	space_counter = 0
	for j in range(1, len(cipher_text), 3):
		if int(cipher_text[j]) ^ ord(letter) == space_ascii:
			space_counter += 1
	letter_2.append(space_counter)
	space_counter = 0
	for k in range(2, len(cipher_text), 3):
		if int(cipher_text[k]) ^ ord(letter) == space_ascii:
			space_counter += 1
	letter_3.append(space_counter)

# The most likely key letter for each position is the one producing the most spaces
key = [ord(key_base[letter_1.index(max(letter_1))]),
       ord(key_base[letter_2.index(max(letter_2))]),
       ord(key_base[letter_3.index(max(letter_3))])]

""" Decryption """

decrypted_message = []
for i in range(len(cipher_text)):
	decrypted_message.append(int(cipher_text[i]) ^ key[i % 3])

ascii_sum = sum(decrypted_message)
finish_time = time.time() - start_time
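As a quick, optional check that the recovered key produces readable English, the decrypted ASCII codes can be rendered as text:

# Print the recovered key, the start of the plaintext, and the summary values
print("".join(chr(k) for k in key))
print("".join(chr(c) for c in decrypted_message)[:100])
print(ascii_sum, finish_time)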

Energy landscape for digit sum of a^b

Thought it’d be fun to try simulated annealing on the maximum digit sum of a^b for a, b ≤ K. In the above diagram K = 100, giving a discrete search space of 10,000 values.

Though it is quite easy to brute-force the search for small K, simulated annealing offers a more scalable approach for large K. Below is an implementation in Python:


import numpy as np

def digit_sum(number):
	s = 0
	while number:
		s += number % 10
		number //= 10
	return s

def compute_energy(base, exponent):
	# Negate so that maximising the digit sum becomes minimising the energy
	return -(digit_sum(base ** exponent))

def simulated_annealing(start_temp, cooling_rate, max_base, max_exponent, max_iter):
	current_base = np.random.randint(1, max_base + 1)
	current_exponent = np.random.randint(1, max_exponent + 1)
	current_energy = compute_energy(current_base, current_exponent)
	iteration = 0
	while iteration < max_iter:
		new_base = np.random.randint(1, max_base + 1)
		new_exponent = np.random.randint(1, max_exponent + 1)
		new_energy = compute_energy(new_base, new_exponent)
		if new_energy < current_energy:
			current_base, current_exponent = new_base, new_exponent
			current_energy = new_energy
			start_temp *= (1 - cooling_rate)
		elif new_energy == current_energy:
			start_temp *= (1 - cooling_rate / 5)
		else:
			# Metropolis acceptance of a worse solution, governed by the temperature
			if np.exp((current_energy - new_energy) / start_temp) > np.random.random():
				current_base, current_exponent = new_base, new_exponent
				current_energy = new_energy
				start_temp *= (1 - cooling_rate)
			else:
				start_temp *= (1 - cooling_rate / 5)
		iteration += 1
	return -(current_energy)
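As a rough usage sketch for the K = 100 case above (the temperature, cooling rate and iteration count here are illustrative, not tuned):

# Search a^b for a, b <= 100: start temperature 100, cooling rate 0.05, 5,000 iterations
print(simulated_annealing(100, 0.05, 100, 100, 5000))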

Machu Picchu


It has been four months since I first arrived in Santiago, Chile. Up until Chile, I had never been to South America. It was always a continent relegated to my imagination, which drew inspiration from the rugged landscapes I had seen in National Geographic as a child. Needless to say, my time spent in Chile and in other LatAm countries has been very fruitful in drawing personal lessons.

Never Stop Learning & Adapting

I have always been fascinated by the formation and downfall of great nations, more concretely the decisions and events which contributed to their predicaments. Though there are certain events in history which may have acted as a tipping point for the relative ascendancy of one nation over another, an Epicurean view of the world doesn’t do the power of human initiative and free will justice. Luck is an element, but I believe it is specific human attitudes and decisions which play a much greater role in the fate of nations.

Travelling affords one an unbiased view into the formation process of nations, why some thrive and others stagnate. So the biggest reward for me when I travel is in drawing valuable lessons from the history of the places I visit. Chile is one of those countries which throughout its more contemporary history has been gifted with natural resource discoveries, but has in past instances opted to squander them rather than plan for the future. It is a country littered with examples of a stable, isolated system being disrupted by an unstoppable external force.

Valparaiso


Valparaiso, a port city on the coast of Chile, is a favorite destination for me, given its relaxed atmosphere and its quaint similarities to San Francisco. But more importantly it is a parable of the continuous growth-and-decline cycle of companies, cities and nations.

Back in the 19th century, Valparaiso gained prominence as a major stopover port for ships rounding South America via the Strait of Magellan en route to the west coast of the U.S. or Asia. The economic prosperity of trade brought about Latin America’s first stock exchange and numerous other firsts in civil services. Yet the good times wouldn’t last long, as the opening of the Panama Canal in 1914 dealt a severe blow to Valparaiso’s shipping volume. What is interesting is why Valparaiso never planned for the disruption, given that the first attempt to construct the Panama Canal began in 1881. Why Valparaiso never adapted or made preparations for disruption is something I’m not prepared to answer at this stage, but the city does serve as a living reminder of the consequences when imagination fails and complacency takes over.

In some respects the influx of foreign entrepreneurs brought in by the Start-Up Chile program is at the very least a good external wake-up call for Chile. Chilean society is still highly stratified, with family name and where you live being a crude gauge of your position and power in society. As a foreigner, I was very much ignorant of this undercurrent of classism, nor do I care much for it. It is my hope that the other foreign entrepreneurs and I can really catalyse the process of hacking away at this form of neo-serfdom.

The Chilean entrepreneurial scene is still very much a work in progress, and a big hindrance to its long-term success is the inherent classism that currently exists. In shifting the factors behind success and achievement in Chilean society from indeterminate, probabilistic factors (e.g. family heritage, genetic luck) to determinate ones (hard work, hustle, talent, etc.), Chile can move one step closer to building a tech hub that contributes original and innovative companies. Otherwise it is highly likely to go down the road of 1-to-n globalisation, where companies are largely clones of existing, working ones in other parts of the world.

The World Is Truly Converging – In The Economic Sense

Vina del Mar


My first two months in South America were spent in Chile and, to be quite honest, it didn’t feel like “South America”. The normal amenities of Starbucks, large shopping malls and Western brands are ever present in Chile, to the extent that it becomes hard to distinguish the unique characteristics of the country.

I’ve had many instances in Chile where I felt I could’ve just as easily been in San Francisco, New York, Hong Kong or any other major international hub. Looking back at my childhood in the 90s, there was this real sense that the world was opening up, people were embracing multiculturalism and the world was beginning to converge. After my time in Chile, I would say that the convergence theory of globalisation still holds true, but only loosely, in the economic sense.

On one hand it is nice to have the amenities I’m used to readily available in Chile, but it is also sad to see that my generation is converging on the same brand of Western culture, with little differentiation apart from the native tongues we speak. Whether this is a good thing or not really depends on the individual, but for me it is a little sad to see that we are losing the distinct characteristics that define who we are and our heritage.

Up until now I haven’t experienced much of Chilean culture; it is a country rushing to modernise and, in some ways, to move on quickly from its ugly recent history. The country has done well economically with its free market policies and its large copper reserves, but whether it avoids another resource-dependent boom-and-bust cycle really hinges on how well it plans for the future.

Between May and October 1648, a series of treaties was signed in Osnabruck and Munster which ended the Thirty Years’ War in the Holy Roman Empire and the Eighty Years’ War between Spain and the Dutch Republic. What made the treaties of Westphalia important and notable was the establishment of the concept of territorial sovereignty.

The introduction of the sovereign state signaled the end of perennial warfare in mainland Europe, but also the triumph of sovereignty over empire. This set of treaties would lay the foundations for a large part of modern international law. If one distills the fundamental change the Westphalian treaties brought to governance, one can see the coalescence of power defined according to geographical bounds. In essence, Westphalia was a key force in demarcating the various peoples of Europe into territorial regions. It transferred governance from sovereignty over all those who share similar ethnic traits to a more rigid system bound by defined geographical boundaries. Citizenship was no longer necessarily decreed along ethnic lines, but now also in terms of the geographical location of the individual.

Pre-Westphalia: Fluid territories and national boundaries defined as a function of settlement areas


The centralized power held by various ethnic rulers and emperors over all subjects of the same ethnic group was now controlled and tightly bound by pre-determined geographical bounds. Thus, Westphalia brought about a relative decentralization of the power held by rulers. As a result, individuals were free to determine the system of power and control they were subjected to, because mobility and emigration were made possible under the sovereign state framework.

Fast forward to today, the right to self-determination of the set of laws one is subjected to is all around us. Globalization and modern interracial societies (United States, United Kingdom) arguably trace their roots back to this legal framework. In paving the way for greater human mobility, Westphalia brought about a massive decentralization of political power, shifting it from traditional high priests and hereditary rulers to the general population.

Post-Westphalia: Concept of sovereign state established, fixed boundaries. More conducive for immigration


A clear analogue to the decentralization of power brought about by the Peace of Westphalia is the internet today. Traditional industries such as media, which had held a monopoly on information, are now being disrupted one by one by the power of the internet to connect and leverage the collective resources of a vast number of individuals. Think for a moment about the way in which the news industry has changed. Twenty years ago, news was very much top down. Breaking events were covered by news outlets, edited and then delivered to individual consumers. The power to collect and disseminate information lay in the hands of a small group of individuals. These individuals, should they choose to, had the ability to censor and manipulate, given the lack of competition. Today, the advent of social platforms like Twitter and Facebook has given everyday users a channel to voice their opinions and their own observations of the world around them. The proliferation of mobile devices coupled with software and internet solutions has enabled many more individuals to publish and share their stories. By harnessing the collective power of localized reporters, the current internet structure has created far more competition for the traditional news agencies.

The vast wealth of information stored online and the ability to connect with another individual across the world is hugely empowering. A couple of decades ago, formal education was largely the domain of governments and private universities, but the rise of MOOCs today threatens this traditional, top-down, one-size-fits-all structure of instruction. Whereas previous generations had a much more localized social experience, today we are able to connect with and befriend individuals from across the world. It is also a safe bet that many of us know someone halfway across the world better than we know our neighbors.

It has been more than three centuries since the Peace of Westphalia. The internet today draws many parallels to the ways in which the peace treaty shaped civil liberties and individual freedoms. The internet has and will continue to chip away at the hegemony of traditional power structures in many areas from media to education.

At the core of the internet is its ability to decentralize power and offer tailored experiences to the individual. As more of the world comes online, the internet’s ability to connect people and share ideas is an incredibly exciting lens for re-evaluating the current legal framework that governs us. What does the ability to define identity and create groups of association via the web mean for definitions of citizenship? Could our current legal framework simply be a model for interfacing with the physical world, whilst a parallel framework develops on the internet? Or could the current legal framework be completely revamped?

2^8888


Sometimes one takes on problems not for their practical purpose, but just for the sake of solving them. Since JavaScript’s native numbers can only represent integers exactly up to roughly 16 digits, I wanted to compute the actual output of two raised to a very large exponent.

One effective means of doing so is to store the decimal digits of 2^n in an array, doubling each digit and propagating the carries at every step. The JavaScript below does this:

var log = console.log;

// Sum the elements of an array (used for the digit sum)
var sum = function(as) {
        var cc = 0;
        for (var ii = 0; ii < as.length; ii++) {
                cc += as[ii];
        }
        return cc;
};

// Build the decimal digits of base * 2^(k-1), least significant digit first.
// The base is assumed to be a single digit, so power(2, k) yields 2^k.
var power = function(base, k) {
        var output = [base];
        var init = 1;
        while (init < k) {
                var carry = 0;
                // Double every digit and propagate the carries in a single pass
                for (var ii = 0; ii < output.length; ii++) {
                        var doubled = output[ii] * 2 + carry;
                        output[ii] = doubled % 10;
                        carry = Math.floor(doubled / 10);
                }
                if (carry > 0) {
                        output.push(carry);
                }
                init += 1;
        }
        return output;
};

// Join the digit array into a decimal string, most significant digit first
var agg = function(as) {
        var output = "";
        for (var ii = as.length - 1; ii >= 0; ii--) {
                output += as[ii].toString();
        }
        return output;
};

// Compute 2^8888 and report its digit sum
var digits = power(2, 8888);
log(agg(digits));
log("digit sum: " + sum(digits));

Using this script, we can compute 2^8888, which has a digit sum of 12055.
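For a one-line cross-check of the digit sum, Python’s arbitrary-precision integers can be used directly:

# Digit sum of 2^8888 via Python's built-in big integers
print(sum(int(d) for d in str(2 ** 8888)))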

Collatz sequence stopping time for first 1,000 integers


The Collatz sequence is defined by the following rule:

n = n/2 if n is even
n = 3n + 1 if n is odd

Based on this rule and starting with 20, the following sequence is generated:

20 – 10 – 5 – 16 – 8 – 4 – 2 – 1


var math = require('mathjs');
var log = console.log;

// List of integers from 1 to n
var integer = function(n) {
        var output = [];
        for (var ii = 1; ii <= n; ii++) {
                output.push(ii);
        }
        return output;
};

// Length of the Collatz sequence starting at k, including k and the final 1
var collatz = function(k) {
        var length = 1;
        var n = k;
        while (n > 1) {
                if (n % 2 === 0) {
                        n /= 2;
                } else {
                        n = 3 * n + 1;
                }
                length += 1;
        }
        return length;
};

// Collatz sequence length for every starting number from 1 to j
var colsearch = function(j) {
        var intlist = integer(j);
        var output = [];
        for (var ii = 0; ii < intlist.length; ii++) {
                output.push(collatz(intlist[ii]));
        }
        return output;
};

// Find the (1-indexed) starting numbers that produce the longest sequence
var isearch = function(as) {
        var output = [];
        var max = math.max(as);
        for (var ii = 0; ii < as.length; ii++) {
                if (as[ii] === max) {
                        output.push(ii + 1);
                }
        }
        return output;
};

// Starting number up to one million with the longest Collatz sequence
log(isearch(colsearch(1000000)));

Using this code, the starting number up to one million that produces the longest Collatz sequence is 837799.
