Blog Archive


The London riots of 2011


[Figure: riot map]

Today saw our new paper on the London riots published as part of the Nature series in Scientific Reports. The work began a few months after the events of summer 2011, in which London and England experienced the worst period of sustained rioting, violence and arson seen in Great Britain for over 20 years.

Once things had calmed down, two key questions about the events remained: first, why were some areas of the city so much more susceptible than others; and second, was there anything the police could have done to bring about a swifter resolution to the unrest? Motivated by these issues, we wanted to create a mathematical replica, or simulation, of the events, capable of answering some of these questions.

Our research group at UCL (led by Sir Alan Wilson in CASA) teamed up with the UCL department of Crime Science and the Metropolitan Police, who kindly gave us access to their records of all arrests made in connection with the events. By going through the data, we were able to analyse the patterns seen across London and to create a simulation of rioting across the city by drawing analogies with three well-known and well-studied models from across the human and natural world.

First, we simulate a resident's initial decision to riot. The data show that events escalated on each day of the riots to a peak in the evening, calming down in the small hours as police began to regain control of the city. Graphs of this rioter involvement over time are reminiscent of similar plots of seasonal flu outbreaks from year to year, the recycling of various fashions over time, or the spread of a rumour or meme on the internet. By drawing an analogy between 'the idea to riot' and these feedback processes (the more people with flu, the more people will be exposed to flu, and so on) we can simulate the spread of the idea to riot across the city. We took into account that people would be less likely to get involved if there was a heavy police presence (or, more specifically, a high chance of being arrested) and, since the data showed that rioters tended to come from some of the most deprived areas of the city, we assumed that residents of deprived areas were probabilistically more susceptible to rioting.
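
That kind of feedback process can be sketched in a few lines. This is a toy illustration only, not the model from the paper: the function name, the deprivation and arrest-probability parameters and the 10% drop-out rate are all invented for the example.

```python
def simulate_uptake(deprivation, arrest_prob, active_frac, n_steps=100):
    """Toy contagion of the 'idea to riot': at each step, inactive residents
    adopt the idea with probability proportional to the currently active
    fraction, scaled up by deprivation and damped by the chance of arrest."""
    history = [active_frac]
    for _ in range(n_steps):
        p_adopt = active_frac * deprivation * (1 - arrest_prob)
        newly_active = (1 - active_frac) * p_adopt  #susceptibles adopting
        dropouts = 0.1 * active_frac                #rioters losing interest
        active_frac = active_frac + newly_active - dropouts
        history.append(active_frac)
    return history
```

With, say, `simulate_uptake(deprivation=0.5, arrest_prob=0.2, active_frac=0.01)`, the active fraction grows flu-style towards an equilibrium; raising `arrest_prob` slows the growth and lowers the eventual level.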

Once a resident in the simulation has decided to riot, it remains for them to choose where to offend. The data showed that rioters tended to choose retail centres relatively close to where they lived, and our model reflects this finding. Perhaps unsurprisingly, the same pattern is found in data on where people choose to shop, and so our simulation takes the form of a traditional retail model: people tend to prefer shopping locally, but are prepared to travel further for a larger retail centre, and the same appears to be true of rioters. This analogy also seems to support the rather flippant headlines seen after the events describing them as 'shopping with violence'.
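
As a flavour of how that retail-style choice might look in code, here is a minimal sketch; the attractiveness scores, distances and decay parameters are illustrative, not calibrated values from the paper.

```python
import math

def target_probabilities(attractiveness, distances, alpha=1.0, beta=0.5):
    """Toy retail-model choice: each retail centre is weighted by its
    size (attractiveness**alpha), discounted by travel cost exp(-beta*d);
    a rioter picks a target with probability proportional to its weight."""
    weights = [a**alpha * math.exp(-beta * d)
               for a, d in zip(attractiveness, distances)]
    total = sum(weights)
    return [w / total for w in weights]
```

The trade-off the data suggest – prefer local targets, but travel further for a big centre – is captured by the balance of `alpha` and `beta`.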

Finally, once a simulated rioter is involved in a riot, they begin to interact with the police. We assume that rioters are generally trying to evade arrest, while police are generally trying to reduce rioter numbers, which is reminiscent of the motivations of predators and prey in the wild (formally, Lotka-Volterra models of population dynamics), a very old and well-studied problem in mathematics. This forms the final stage of our model, allowing us to test the rioters' reaction to (and hence the effectiveness of) different policing strategies, which could theoretically be used in future to assist the police in bringing about a swifter resolution to any unrest.
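
Here is a toy version of those predator-prey dynamics, stepped forward with a simple Euler scheme. All the rate constants are invented for illustration, not taken from the paper.

```python
def riot_police_dynamics(r0, p0, dt=0.01, steps=2000,
                         growth=1.0, arrest=0.1, deploy=0.05, withdraw=0.5):
    """Toy Lotka-Volterra sketch: rioter numbers r grow on their own but
    fall with arrests (proportional to r*p); police numbers p rise in
    response to rioters and are withdrawn as things calm down."""
    r, p = r0, p0
    traj = [(r, p)]
    for _ in range(steps):
        dr = growth * r - arrest * r * p
        dp = deploy * r * p - withdraw * p
        r, p = r + dr * dt, p + dp * dt
        traj.append((r, p))
    return traj
```

Varying `deploy` and `withdraw` is the crude analogue of testing different policing strategies: in this toy model, faster deployment damps the rioter peak sooner.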

The map above is one example of our results (b) compared to the full events themselves (a). Our aim is not to replicate the full riots blow by blow (there is far too much dependence on ‘initial conditions’ for that) but regardless, we get very good qualitative agreement, with 26 of the 33 boroughs showing rioter percentages in the same or adjacent bands as the data.

You can find the paper in all of its delicious glory right here:

http://www.nature.com/srep/2013/130221/srep01303/full/srep01303.html

While links to deprivation and government cuts to youth services were not the subject or focus of the Scientific Reports paper, this connection did lead me to set up a splinter project, the result of which can be seen below. Happily, it also contains a rather thorough explanation of the model used in the paper which some readers might find useful:

I should point out that the motivations and conclusions of this piece differ from the focus of the paper – some of the work in this video has not been peer reviewed and does not necessarily represent the views of the other authors etc etc etc. blah blah blah. For more info on this project, see my other blog post here.


If you ever wanted to test drive a spatial interaction model, here’s your chance…

Hello blog, we haven’t seen each other for a while – I’ve been meaning to call, but you know how it is, just been so busy with work and all that…

What do you mean I only visit when I want to use you for something? That’s not fair, I do check up on you now and again, although you’ve been looking a little forlorn – like you’ve not been looking after yourself properly –  so I’ve kept my distance.

I’ve brought you a present though! A brand-spanking-new re-hash of some work I did last year that’s been sitting on my hard drive, waiting to be cobbled together into something approaching a paper…

The lovely folks at CASA have honoured it with working paper status, so if anyone wants a step-by-step guide of how to run a spatial interaction model in R, then this is the place to visit:

http://www.bartlett.ucl.ac.uk/casa/pdf/paper181

I know you like pictures, so before I go, here’s a tasty little number from that very paper.


Converting Latitude and Longitude to British National Grid

This code reads in a .csv file called LatLon, expecting two columns with headers – Latitude and Longitude (in WGS84, decimal form). If the script is run in the same directory as LatLon.csv, it will spit out a second file called LatLonandBNG.csv, with two additional columns: OSGB36 Eastings and Northings respectively.

For the inverse transform – OSGB36 to WGS84 – please refer to this post, where you can find the relevant Python script and more details on the algorithms involved.

If you don’t have Python, you can find instructions for the bits you need in another of my previous posts.

#This code converts lat lon (WGS84) to British National Grid (OSGB36)

from math import sqrt, pi, sin, cos, tan, atan2 as arctan2

import csv

def WGS84toOSGB36(lat, lon):

    #First convert to radians
    #These are on the wrong ellipsoid currently: GRS80. (Denoted by _1)
    lat_1 = lat*pi/180
    lon_1 = lon*pi/180

    #We want to convert to the Airy 1830 ellipsoid, but first we need the
    #characteristics of the ellipsoid we are starting from:
    a_1, b_1 = 6378137.000, 6356752.3141 #The GRS80 semi-major and semi-minor axes used for WGS84 (m)
    e2_1 = 1- (b_1*b_1)/(a_1*a_1)   #The eccentricity squared of the GRS80 ellipsoid
    nu_1 = a_1/sqrt(1-e2_1*sin(lat_1)**2)

    #First convert to cartesian from spherical polar coordinates
    H = 0 #Third spherical coord.
    x_1 = (nu_1 + H)*cos(lat_1)*cos(lon_1)
    y_1 = (nu_1+ H)*cos(lat_1)*sin(lon_1)
    z_1 = ((1-e2_1)*nu_1 +H)*sin(lat_1)

    #Perform Helmert transform (to go between GRS80 (_1) and Airy 1830 (_2))
    s = 20.4894*10**-6 #The scale factor -1
    tx, ty, tz = -446.448, 125.157, -542.060 #The translations along x,y,z axes respectively
    rxs,rys,rzs = -0.1502, -0.2470, -0.8421  #The rotations along x,y,z respectively, in seconds
    rx, ry, rz = rxs*pi/(180*3600.), rys*pi/(180*3600.), rzs*pi/(180*3600.) #In radians
    x_2 = tx + (1+s)*x_1 + (-rz)*y_1 + (ry)*z_1
    y_2 = ty + (rz)*x_1  + (1+s)*y_1 + (-rx)*z_1
    z_2 = tz + (-ry)*x_1 + (rx)*y_1 +  (1+s)*z_1

    #Back to spherical polar coordinates from cartesian
    #Need some of the characteristics of the new ellipsoid
    a, b = 6377563.396, 6356256.909 #The Airy 1830 semi-major and semi-minor axes used for OSGB36 (m)
    e2 = 1- (b*b)/(a*a)   #The eccentricity squared of the Airy 1830 ellipsoid
    p = sqrt(x_2**2 + y_2**2)

    #Lat is obtained by an iterative procedure:
    lat = arctan2(z_2,(p*(1-e2))) #Initial value
    latold = 2*pi
    while abs(lat - latold)>10**-16:
        lat, latold = latold, lat
        nu = a/sqrt(1-e2*sin(latold)**2)
        lat = arctan2(z_2+e2*nu*sin(latold), p)

    #Lon and height are then pretty easy
    lon = arctan2(y_2,x_2)
    H = p/cos(lat) - nu

    #E, N are the British national grid coordinates - eastings and northings
    F0 = 0.9996012717                   #scale factor on the central meridian
    lat0 = 49*pi/180                    #Latitude of true origin (radians)
    lon0 = -2*pi/180                    #Longitude of true origin and central meridian (radians)
    N0, E0 = -100000, 400000            #Northing & easting of true origin (m)
    n = (a-b)/(a+b)

    #meridional radius of curvature
    rho = a*F0*(1-e2)*(1-e2*sin(lat)**2)**(-1.5)
    eta2 = nu*F0/rho-1

    M1 = (1 + n + (5./4)*n**2 + (5./4)*n**3) * (lat-lat0)
    M2 = (3*n + 3*n**2 + (21./8)*n**3) * sin(lat-lat0) * cos(lat+lat0)
    M3 = ((15./8)*n**2 + (15./8)*n**3) * sin(2*(lat-lat0)) * cos(2*(lat+lat0))
    M4 = (35./24)*n**3 * sin(3*(lat-lat0)) * cos(3*(lat+lat0))

    #meridional arc
    M = b * F0 * (M1 - M2 + M3 - M4)          

    I = M + N0
    II = nu*F0*sin(lat)*cos(lat)/2
    III = nu*F0*sin(lat)*cos(lat)**3*(5- tan(lat)**2 + 9*eta2)/24
    IIIA = nu*F0*sin(lat)*cos(lat)**5*(61- 58*tan(lat)**2 + tan(lat)**4)/720
    IV = nu*F0*cos(lat)
    V = nu*F0*cos(lat)**3*(nu/rho - tan(lat)**2)/6
    VI = nu*F0*cos(lat)**5*(5 - 18* tan(lat)**2 + tan(lat)**4 + 14*eta2 - 58*eta2*tan(lat)**2)/120

    N = I + II*(lon-lon0)**2 + III*(lon- lon0)**4 + IIIA*(lon-lon0)**6
    E = E0 + IV*(lon-lon0) + V*(lon- lon0)**3 + VI*(lon- lon0)**5 

    #Job's a good'n.
    return E,N

#Read in from a file
LatLon = csv.reader(open('LatLon.csv', 'rU'), delimiter = ',')
next(LatLon) #Skip the header row

#Get the output file ready
outputFile = open('LatLonandBNG.csv', 'w', newline='')
output=csv.writer(outputFile,delimiter=',')
output.writerow(['Lat', 'Lon', 'E', 'N'])

#Loop through the data
for line in LatLon:
    lat = line[0]
    lon = line[1]
    E, N = WGS84toOSGB36(float(lat), float(lon))
    output.writerow([str(lat), str(lon), str(E), str(N)])

#Close the output file
outputFile.close()

Converting British National Grid to Latitude and Longitude II

A few months ago, I wrote a Python script to convert British National Grid coordinates (OSGB36) to latitude and longitude (WGS84). A fellow blogger, Andrzej Bieniek, very kindly pointed out that the algorithm was only accurate to around 100m, because of an additional subtlety which I hadn’t taken into account.

I’ll have a quick bash at explaining the reasoning behind this difference, but if you want to skip straight to the new version of the code (now accurate to 5m), I’ve posted it at the bottom of the page in all its delicious glory. Alternatively, if you are looking for a similarly accurate script applying the inverse transform – i.e. WGS84 lat, lon to OSGB36 Eastings, Northings – you can find it here.

Because the Earth’s surface is not smooth, any attempt to represent a geographical area in two dimensions – on paper, on screen etc – will naturally result in some distortions and discontinuities. One way to get round this is to approximate the Earth using a ‘3D ellipse’ (or ‘ellipsoid’, if we’re being proper): try to fit it as best you can to the Earth’s squashed-sphere shape, cross your fingers and not get too worried about ‘land’, ‘oceans’ and other such trivial inconveniences.

If you’re looking to consider large distances, or perhaps the whole globe at once – as is the case with WGS84 – you’ll need to fit the ellipsoid to the entire Earth’s surface at once. If, however, you’re only interested in mapping a small area of the globe – as is the case with British National Grid – it may be that a much smaller ellipsoid would work much better:

Of course, the picture above – taken from the Ordnance Survey website – is a greatly exaggerated illustration of this idea, but I think it demonstrates the concept pretty well. This is exactly what I hadn’t realised when doing my previous post: the ellipsoids that are used for the two systems of British National Grid and WGS84 latitude and longitude are ever so slightly different – hence why the results could be up to 100m out. British National Grid is based on the Airy* 1830 ellipsoid, while latitude and longitude use the GRS80 ellipsoid (on which WGS84 is based).

The origins of these two ellipsoids, Airy 1830 and GRS80, are in slightly different positions, their axes are a little rotated, and one is a tiny bit larger than the other – as in the picture below.

It’s not sufficient to just find ‘lat/lon style’ coordinates from Eastings and Northings. You’ve also got to take account of the differences between the two ellipsoids before you can reproduce the latitude and longitude coordinates.

So, on top of the original post, to convert between the two ellipsoids you must scale, translate and rotate every point on the surface.

Actually, once you get down to it, the maths isn’t that bad – although the transform is a great deal simpler if you first convert to Cartesian coordinates, and that’s the only annoying bit. The full process is all contained in the Ordnance Survey guide (just annoyingly not all in one place) and can be summarised as follows:

1) Use your Eastings and Northings to find lat/lon style coordinates on the Airy 1830 ellipsoid. (As in the previous post, or on pages 41&42 of the guide).

2) Convert those lat/lon coords to 3D Cartesian coordinates. (As on page 38 of the guide). Take the third spherical polar coordinate – height – as zero in your calculations. (As though the point sits on the surface of the ellipsoid).

3) Use Helmert’s datum transform to convert between the two ellipsoids, and anchor your coordinates on the GRS80 system. (As on page 29 of the guide).

4) Your answer will be in Cartesian coordinates, so the final step is to convert back to spherical polars to get the final lat/lon. (As on page 39 of the guide).

All the constants you will need are also there in the guide – although you might have to hunt for them a little. Or just nick ‘em from my script below.

As before, the code reads in a .csv file called BNG, expecting two columns with headers – East and North. If the script is run in the same directory as BNG.csv, it will spit out a second file called BNGandLatLon.csv, with both sets of coordinates (lat and lon will be in decimal form). If you don’t have Python, you can find instructions for the bits you need in another of my previous posts. Aren’t I nice to you?

*The same chap as Airy’s function, which I virtually have tattooed on the insides of my eyelids from my fluid dynamics days. Not that this is remotely interesting, I just think he’s a top fella is all.

#This code converts British National Grid (OSGB36) to lat lon (WGS84)

from math import sqrt, pi, sin, cos, tan, atan2 as arctan2
import csv

def OSGB36toWGS84(E,N):

    #E, N are the British national grid coordinates - eastings and northings
    a, b = 6377563.396, 6356256.909     #The Airy 1830 semi-major and semi-minor axes used for OSGB36 (m)
    F0 = 0.9996012717                   #scale factor on the central meridian
    lat0 = 49*pi/180                    #Latitude of true origin (radians)
    lon0 = -2*pi/180                    #Longitude of true origin and central meridian (radians)
    N0, E0 = -100000, 400000            #Northing & easting of true origin (m)
    e2 = 1 - (b*b)/(a*a)                #eccentricity squared
    n = (a-b)/(a+b)

    #Initialise the iterative variables
    lat,M = lat0, 0

    while N-N0-M >= 0.00001: #Accurate to 0.01mm
        lat = (N-N0-M)/(a*F0) + lat;
        M1 = (1 + n + (5./4)*n**2 + (5./4)*n**3) * (lat-lat0)
        M2 = (3*n + 3*n**2 + (21./8)*n**3) * sin(lat-lat0) * cos(lat+lat0)
        M3 = ((15./8)*n**2 + (15./8)*n**3) * sin(2*(lat-lat0)) * cos(2*(lat+lat0))
        M4 = (35./24)*n**3 * sin(3*(lat-lat0)) * cos(3*(lat+lat0))
        #meridional arc
        M = b * F0 * (M1 - M2 + M3 - M4)          

    #transverse radius of curvature
    nu = a*F0/sqrt(1-e2*sin(lat)**2)

    #meridional radius of curvature
    rho = a*F0*(1-e2)*(1-e2*sin(lat)**2)**(-1.5)
    eta2 = nu/rho-1

    secLat = 1./cos(lat)
    VII = tan(lat)/(2*rho*nu)
    VIII = tan(lat)/(24*rho*nu**3)*(5+3*tan(lat)**2+eta2-9*tan(lat)**2*eta2)
    IX = tan(lat)/(720*rho*nu**5)*(61+90*tan(lat)**2+45*tan(lat)**4)
    X = secLat/nu
    XI = secLat/(6*nu**3)*(nu/rho+2*tan(lat)**2)
    XII = secLat/(120*nu**5)*(5+28*tan(lat)**2+24*tan(lat)**4)
    XIIA = secLat/(5040*nu**7)*(61+662*tan(lat)**2+1320*tan(lat)**4+720*tan(lat)**6)
    dE = E-E0

    #These are on the wrong ellipsoid currently: Airy1830. (Denoted by _1)
    lat_1 = lat - VII*dE**2 + VIII*dE**4 - IX*dE**6
    lon_1 = lon0 + X*dE - XI*dE**3 + XII*dE**5 - XIIA*dE**7

    #Want to convert to the GRS80 ellipsoid. 
    #First convert to cartesian from spherical polar coordinates
    H = 0 #Third spherical coord. 
    x_1 = (nu/F0 + H)*cos(lat_1)*cos(lon_1)
    y_1 = (nu/F0+ H)*cos(lat_1)*sin(lon_1)
    z_1 = ((1-e2)*nu/F0 +H)*sin(lat_1)

    #Perform Helmert transform (to go between Airy 1830 (_1) and GRS80 (_2))
    s = -20.4894*10**-6 #The scale factor -1
    tx, ty, tz = 446.448, -125.157, 542.060 #The translations along x,y,z axes respectively
    rxs,rys,rzs = 0.1502,  0.2470,  0.8421  #The rotations along x,y,z respectively, in seconds
    rx, ry, rz = rxs*pi/(180*3600.), rys*pi/(180*3600.), rzs*pi/(180*3600.) #In radians
    x_2 = tx + (1+s)*x_1 + (-rz)*y_1 + (ry)*z_1
    y_2 = ty + (rz)*x_1  + (1+s)*y_1 + (-rx)*z_1
    z_2 = tz + (-ry)*x_1 + (rx)*y_1 +  (1+s)*z_1

    #Back to spherical polar coordinates from cartesian
    #Need some of the characteristics of the new ellipsoid    
    a_2, b_2 = 6378137.000, 6356752.3141 #The GRS80 semi-major and semi-minor axes used for WGS84 (m)
    e2_2 = 1- (b_2*b_2)/(a_2*a_2)   #The eccentricity squared of the GRS80 ellipsoid
    p = sqrt(x_2**2 + y_2**2)

    #Lat is obtained by an iterative procedure:
    lat = arctan2(z_2,(p*(1-e2_2))) #Initial value
    latold = 2*pi
    while abs(lat - latold)>10**-16: 
        lat, latold = latold, lat
        nu_2 = a_2/sqrt(1-e2_2*sin(latold)**2)
        lat = arctan2(z_2+e2_2*nu_2*sin(latold), p)

    #Lon and height are then pretty easy
    lon = arctan2(y_2,x_2)
    H = p/cos(lat) - nu_2

    #Uncomment this line if you want to print the results
    #print [(lat-lat_1)*180/pi, (lon - lon_1)*180/pi]

    #Convert to degrees
    lat = lat*180/pi
    lon = lon*180/pi

    #Job's a good'n. 
    return lat, lon

#Read in from a file
BNG = csv.reader(open('BNG.csv', 'rU'), delimiter = ',')
next(BNG) #Skip the header row

#Get the output file ready
outputFile = open('BNGandLatLon.csv', 'w', newline='')
output=csv.writer(outputFile,delimiter=',')
output.writerow(['Lat', 'Lon', 'E', 'N'])

#Loop through the data
for E,N in BNG:
    lat, lon = OSGB36toWGS84(float(E), float(N))
    output.writerow([str(lat), str(lon), str(E), str(N)])
#Close the output file
outputFile.close()

Working paper fun

Just to prove that I’ve not been twiddling my thumbs for the last 12 months, here’s the first (of hopefully many) working papers to come out of the ENFOLDing Migration stable…

Alan Wilson and I have been working on a new family of Spatial Interaction Models for estimating inter-regional migration flows in Europe. We’ve developed a new multi-level model which uses internal migration data at the regional level and international migration data at the country level to model international regional level flows.
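
While the multi-level model in the paper is rather more involved, its basic building block – a production-constrained spatial interaction model – can be sketched in a few lines. The function name and parameter values below are illustrative only, not those used in the paper.

```python
import math

def production_constrained_flows(origins, destinations, costs, beta=0.3):
    """Toy production-constrained spatial interaction model: each origin's
    total outflow O_i is fixed, and is shared among destinations in
    proportion to attractiveness W_j discounted by exp(-beta * c_ij)."""
    flows = []
    for i, O_i in enumerate(origins):
        weights = [W_j * math.exp(-beta * costs[i][j])
                   for j, W_j in enumerate(destinations)]
        A_i = 1.0 / sum(weights)  #balancing factor: makes row i sum to O_i
        flows.append([O_i * A_i * w for w in weights])
    return flows
```

The balancing factor A_i is what makes the model 'production-constrained': however the attractiveness and cost terms are set, each origin's flows sum exactly to its known outflow.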

Once I’ve finished tidying up the code, I’ll post the full model here too for people to play with – at the moment though, you’ll just have to make do with the paper, which can be found here:

http://www.bartlett.ucl.ac.uk/casa/pdf/paper175.pdf

And here’s a pretty picture from it!

Large migrant flows from the UK

 


Flows Across Scales

At the Complex-City Workshop, held at the Tinbergen Institute at the Free University of Amsterdam on 5-6 December 2011, CASA presented a variety of papers. Alan Wilson presented a paper by Hannah Fry and himself on a four-sector world trade model calibrated so far on 50 countries. The model essentially computes demand for trade across the four sectors and then balances demand and supply, keeping supply fixed and adjusting demand, using the balancing factor as prices. This deals with the fast dynamics, which are then embedded into a slower dynamic framework based on Lotka-Volterra dynamics that adjusts supply, in which prices are also key.

Francesca Medda presented a model of urban evolution based on Turing’s morphogenesis ideas, showing how shocks could rock the locational dynamics which emerge in a hypothetical world of locations around a central business district. Mike Batty presented his speculation that there are seven laws of scaling, a variant of his paper last week to the Scaling in Social Systems conference at Oxford (click here for the pdf), and today Simone Cashili’s paper with Alessandro Chessa and Andre de Montis on commuter networks and community detection will be presented by Andrea. Papers will be posted in due course and will be published in various forums.

As an adjunct to our concern for flows across space, Joan Serras on CASA’s SCALE project has generated some interesting visualisations of food aid, which we will post here very soon. These parallel the flows of food, which is one sector in the Fry-Wilson model noted above.


Bezier Curves

I wonder how long I can carry on working at the Centre for Advanced Spatial Analysis and still manage to avoid learning any GIS.

This week: Bezier curves, how to draw them in Python and particularly, how to decide where to put the control points.

A Bezier curve is a special type of parametric curve used frequently in computer graphics. You may have seen them in PowerPoint, or when using the pen tool in Illustrator, but for my purposes they are a lovely way to visualise flow paths on a map, like this:



This particular map shows the home addresses and riot locations of a subset of suspects involved in the London riots. The data was kindly given to us by Alex Singleton.

The curves themselves are anchored at both ends (origin P0, destination P3) and get their shape from two control points (blue P1 and red P2), as in the following picture:

The curve itself is created from the fairly simple expression (everything in bold is a vector):

 \textbf{B}(t)=(1-t)^3\textbf{P}_0+3t(1-t)^2\textbf{P}_1+3t^2(1-t)\textbf{P}_2+t^3\textbf{P}_3,\;\;\;t\in[0,1]

The parameterisation t starts from zero, so that the curve begins at P0; as t increases, the line bends towards the two control points, first P1 and then P2. Finally, when t is 1, the line finishes at P3.

It’s all very clever stuff, and there is lots and lots of literature available on the web about these curves. But – as far as I can see – no one ever tells you how to decide where to put your control points, and certainly no one offers an algorithm to do so automatically.

But fear not people! I have written one. And ugly and fudgy as it is, it does the job.

The code below takes six inputs:

- origin points,

- destination points,

- a length for the blue line,

- a length for the red line (both as a fraction of the length of the green line),

- the angle between the blue and green lines,

- the angle between the red and green lines (both in radians. I am a mathematician after all).

And it will spit out a totally delicious plot of your curves.

It’s set up to bend the curves clockwise (as in the image at the top of the page), which generally looks nicer when plotting more than one Bezier curve.

Have a play with the last four inputs: the longer the length, the more the curve will deviate from the green line. The higher the angle, the steeper the curve will roll in to its destination. Indeed, choosing the angles to be on the other side of the green line will result in an s-shaped curve. Anyway – the idea is for you to hack out the bits of the code that are useful.

import numpy as sp #numpy provides pi, sqrt, linspace, arctan2 etc. (older scipy re-exported these)
import matplotlib.pyplot as plt

#### Inputs

#A list of P0's and P3's. Must be the same length
origins = [[1,0],[-0.5,sp.sqrt(3)/2], [-0.5,-sp.sqrt(3)/2]]
destinations = [[0,0],[0,0],[0,0]]

#The angle the control point will make with the green line
blue_angle = sp.pi/6
red_angle = sp.pi/4

#And the lengths of the lines (as a fraction of the length of the green one)
blue_len = 1./5
red_len = 1./3

### Workings

#Generate the figure
fig = plt.figure()
ax = fig.add_subplot(111)

#Setup the parameterisation
t = sp.linspace(0,1,100)

for i in range(len(origins)):
    #Read in the origin & destination points
    POx,POy = origins[i][0], origins[i][1]
    P3x,P3y = destinations[i][0], destinations[i][1]

    #Add those to the axes
    ax.plot(POx,POy, 'ob')
    ax.plot(P3x,P3y, 'or')
    ax.plot((POx,P3x),(POy,P3y), 'g')

    #Work out r and theta (as if based at P3)
    r = ((POx-P3x)**2 + (POy-P3y)**2)**0.5
    theta = sp.arctan2((POy-P3y),(POx-P3x))

    #Find the relevant angles for the control points
    aO =  theta + blue_angle+ sp.pi
    aD = theta - red_angle

    #Work out the control points
    P1x, P1y = POx+ blue_len*r*sp.cos(aO), POy + blue_len*r*sp.sin(aO)
    P2x, P2y = P3x+ red_len*r*sp.cos(aD), P3y + red_len*r*sp.sin(aD)

    #Plot the control points and their vectors
    ax.plot((P3x,P2x),(P3y,P2y), 'r')
    ax.plot((POx,P1x),(POy,P1y), 'b')
    ax.plot(P1x, P1y, 'ob')
    ax.plot(P2x, P2y, 'or')

    #Use the Bezier formula
    Bx = (1-t)**3*POx + 3*(1-t)**2*t*P1x + 3*(1-t)*t**2*P2x + t**3*P3x
    By = (1-t)**3*POy + 3*(1-t)**2*t*P1y + 3*(1-t)*t**2*P2y + t**3*P3y

    #Plot the Bezier curve
    ax.plot(Bx, By, 'k')

#Save it
plt.savefig('totally-awesome-bezier.png')
plt.show()
#Bosch.

Converting British National Grid to Latitude and Longitude I

EDIT: Andrzej Bieniek brought to my attention that this version is only accurate to around 100m. For a more accurate script (accurate to 5m) see my new post.

I have recently started to deal with a lot of geographical data in my research, and have begun to realise that it is difficult to go too long without stumbling across the sticky world of map projections – something I knew almost nothing about a year ago.

There is a nice blog post by James Cheshire explaining the background, which I shan’t attempt to reproduce, but explanations aside, I found myself today trying to convert a long list of British National Grid coordinates into latitude and longitude.

Of course, if you’ve got a week spare, you can do these conversions using QGIS or ArcGIS, but all I was looking for was a quick .csv to .csv converter from one coordinate system to the other. After a fruitless half hour of googling, I decided it would just be quicker to write a Python script myself to get the job done. And because I know I’ll use it again, here it is.

The code reads in a .csv file called BNG, expecting two columns with headers – East and North. If the script is run in the same directory as BNG.csv, it will spit out a second file called BNGandLatLon.csv, with both sets of coordinates. All the maths behind the conversion can be found on page 41 of a pdf written by the chaps at Ordnance Survey. Sorry about the lack of highlighting – I’ll sort it out when I have a spare moment.

#This code converts British National Grid (OSGB36) to lat lon (WGS84)

from math import sqrt, pi, sin, cos, tan
import csv

def OSGB36toWGS84(E,N):

    #E, N are the British national grid coordinates - eastings and northings
    a, b = 6377563.396, 6356256.909     #The Airy 1830 semi-major and semi-minor axes used for OSGB36 (m)
    F0 = 0.9996012717                   #scale factor on the central meridian
    lat0 = 49*pi/180                    #Latitude of true origin (radians)
    lon0 = -2*pi/180                    #Longitude of true origin and central meridian (radians)
    N0, E0 = -100000, 400000            #Northing & easting of true origin (m)
    e2 = 1 - (b*b)/(a*a)                #eccentricity squared
    n = (a-b)/(a+b)

    #Initialise the iterative variables
    lat,M = lat0, 0

    while N-N0-M >= 0.00001: #Accurate to 0.01mm
        lat = (N-N0-M)/(a*F0) + lat;
        M1 = (1 + n + (5./4)*n**2 + (5./4)*n**3) * (lat-lat0)
        M2 = (3*n + 3*n**2 + (21./8)*n**3) * sin(lat-lat0) * cos(lat+lat0)
        M3 = ((15./8)*n**2 + (15./8)*n**3) * sin(2*(lat-lat0)) * cos(2*(lat+lat0))
        M4 = (35./24)*n**3 * sin(3*(lat-lat0)) * cos(3*(lat+lat0))
        #meridional arc
        M = b * F0 * (M1 - M2 + M3 - M4)          

    #transverse radius of curvature
    nu = a*F0/sqrt(1-e2*sin(lat)**2)
    #meridional radius of curvature
    rho = a*F0*(1-e2)*(1-e2*sin(lat)**2)**(-1.5)
    eta2 = nu/rho-1

    secLat = 1./cos(lat)
    VII = tan(lat)/(2*rho*nu)
    VIII = tan(lat)/(24*rho*nu**3)*(5+3*tan(lat)**2+eta2-9*tan(lat)**2*eta2)
    IX = tan(lat)/(720*rho*nu**5)*(61+90*tan(lat)**2+45*tan(lat)**4)
    X = secLat/nu
    XI = secLat/(6*nu**3)*(nu/rho+2*tan(lat)**2)
    XII = secLat/(120*nu**5)*(5+28*tan(lat)**2+24*tan(lat)**4)
    XIIA = secLat/(5040*nu**7)*(61+662*tan(lat)**2+1320*tan(lat)**4+720*tan(lat)**6)
    dE = E-E0

    lat = lat - VII*dE**2 + VIII*dE**4 - IX*dE**6
    lon = lon0 + X*dE - XI*dE**3 + XII*dE**5 - XIIA*dE**7

    #Convert to degrees
    lat = lat*180/pi
    lon = lon*180/pi

    return lat, lon

#Read in from a file
BNG = csv.reader(open('BNG.csv', 'rU'), delimiter = ',')
next(BNG) #Skip the header row

#Get the output file ready
outputFile = open('BNGandLatLon.csv', 'w', newline='')
output=csv.writer(outputFile,delimiter=',')
output.writerow(['Lat', 'Lon', 'E', 'N'])

#Loop through the data
for E,N in BNG:
    lat, lon = OSGB36toWGS84(float(E), float(N))
    output.writerow([str(lat), str(lon), str(E), str(N)])
#Close the output file
outputFile.close()

Some stuff on the London riots

 

In the past few weeks our mathmo team – Toby Davies, Peter Baudains and myself – have been looking into some of the reasons why the London riots happened.

If we can suppose that there was some rationality behind the rioters’ actions, then we may construct a mathematical model from a series of simple rules about their behaviour.

If this model is capable of replicating some of the emerging features of the riots themselves, then, working backwards, we can start to say something about the causes of the riots, and perhaps even something about improvements which could be made to police strategy.

There is a fairly big ‘if’ on the rationality front, so I refer you to a nice article called Riots and Rationality, which discusses a similar argument and puts it much more elegantly than I can.

However, if you’d like to hear more of my ramblings, I did an interview for the Global Lab podcast on this very topic, among other things, which you can listen to here.

We will also be publishing our findings relatively soon, and I’ll post a link to the paper here when it’s ready.


New paper from me and the sad end of an era…

Yes, British migration classification fans, it’s the moment you’ve all been waiting for – publication of the CIDER Migration Classification paper! *Cue delirious cheering, whooping, hollering and cries of ‘get in the hole!!’*

This most recent product of my blood, sweat and tears – co-authored by John Stillwell – can be found in the latest edition of Population Trends. I say latest, but I should also say final, as sadly, after 36 years of publication, Population Trends is from today no more. In 2010 the Office for National Statistics took the decision to cease regular publication of all of its journal titles – I can only speculate that this is a victim of the severe cuts currently being enforced across the public sector.

The Autumn 2011 issue really does mark the end of an era and I will be sad to see it go, especially as I think I can safely say that it is the journal, above all others, that has been the source of useful and practical research during my academic career.
