ENFOLD Archive


Research Associate and PhD Studentship on ENFOLDing

Applications are now being invited for two positions on the ENFOLDing project, based within the Centre for Advanced Spatial Analysis at UCL: a Postdoctoral Research Associate within the Department of Security and Crime Science and a PhD Studentship within the Department of Mathematics.

Research Associate: ENFOLDing project

Applications are invited for a full-time researcher to initiate and contribute to the delivery of a programme of high-quality research into the use of applied mathematics in the study of security issues (e.g. conflict, international crime) and related topics. The successful candidate will work on ENFOLDing, an interdisciplinary project funded by the Engineering and Physical Sciences Research Council (EPSRC). This will involve working alongside, and collaborating with, members of the Centre for Advanced Spatial Analysis, Geography, Transport Studies, Political Science, the Bartlett School, Mathematics and the Department of Security and Crime Science.

The post will be available from October 2011 and is funded for three years in the first instance.

Key Requirements

The successful candidate will have expertise in mathematical and/or statistical modelling and/or complex systems, and the ability to develop models related to security and crime. A PhD in mathematics, political science, criminology, statistics or a related area is required, along with excellent written and verbal communication skills and programming experience in any of C#, C++, Java, Stata or R (or evidence of the ability to learn new programming languages).

For more information, and to apply, go to: http://www.ucl.ac.uk/hr/jobs/

PhD Studentship: ENFOLDing project

Applications are invited for a PhD studentship on an interdisciplinary, UK-based project, funded by the Engineering and Physical Sciences Research Council (EPSRC), that seeks to develop complex systems modelling. The studentship sits within ENFOLDing and involves working alongside members of the Centre for Advanced Spatial Analysis, Geography, Transport Studies, Political Science, the Bartlett School, Mathematics and the Department of Security and Crime Science.

Studentship Description

Traditionally, the science used to inform policy makers about future social and economic events has been based on models which treat global systems such as trade and migration in isolation. Because such models ignore the coupling and interaction between these systems, unexpected dynamics can arise, which in turn limits the extent to which the models can be applied when influencing policy. The ENFOLDing project aims to tackle this dilemma. By focusing on four key work streams – trade, migration, security (including crime, terrorism and conflict) and development aid – the dynamics of these interactions and couplings may be studied and modelled. A Global Intelligence System will incorporate these ideas, span many spatial and temporal scales, and contain interacting reaction-diffusion and network models described in the conventional languages of complexity theory: chaos, bifurcations, turbulence, catastrophes and phase transitions. These dynamic, nonlinear models will be applied to and assessed against existing data, eventually informing global policy makers about future events and helping to develop appropriate policy responses.

Each of the four work streams has a small group of dedicated researchers. The role of the successful candidate, on the fifth work stream – developing the tools of complexity science – will be to work towards the development of new mathematical models and the analysis of existing ones (for example, cellular automata, agent-based models, reaction-diffusion and Lotka-Volterra models).

There will be opportunities for multi-disciplinary training of the doctoral candidate, since the ENFOLDing programme is based around ten academic faculty at UCL spanning a wide range of centres and departments.

Fees will be paid in full for UK and EU nationals only. The studentship lasts three years and covers tuition fees at the home rate plus standard living expenses; a London allowance applies.

Person Specification

Applicants should have a strong background in mathematics or a similar quantitative subject, usually with a First or Upper Second class honours degree, demonstrate good self-motivation, and be willing to work as part of an interdisciplinary team. Experience in numerical modelling and programming is beneficial.

For more information, and to apply, go to: http://www.ucl.ac.uk/hr/jobs/


The importance of being discrete.

If we’re being accurate, the title should really be “The importance of using appropriate temporal spacing when applying a discretisation to a continuous time scale”. But I felt the above was a touch more catchy.

There’s been a fair amount of noise in the media recently about 3D printers and the exciting possibilities which they present: here’s a video of our resident compugeek Stevie G building a 3D printer in 24 hours, and a lovely video of some chaps at EADS innovation printing a 3D bike.

These printers are based on the very simple principle that a 3D object can be built from a series of 2D slices. Each new slice sits on top of the previous one, and as the slices all stack together in succession, the object forms.

It is exactly this principle which forms the basis of a time-marching algorithm, often used when modelling dynamic or evolving systems on a computer.

By chopping up the time period under consideration into lots of tiny slices (or steps), you can build up a solution by calculating what happens at the next slice based on the state of the system at the current one. “Stacking” these solutions together leads to a dynamic and evolving model.
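A minimal sketch of that stacking idea, using simple exponential decay as a stand-in system (the equation and numbers here are purely illustrative, not anything from my own models):

```python
# Time-marching in miniature: each new "slice" of the solution is computed
# from the previous one. Here, forward Euler on dx/dt = -x with x(0) = 1,
# whose exact solution is x(t) = exp(-t).
import math

def march(dt, t_end=5.0, x0=1.0):
    n = round(t_end / dt)   # number of slices to stack
    x = x0
    for _ in range(n):
        x = x + dt * (-x)   # next slice = current slice + dt * rate of change
    return x

print(abs(march(0.001) - math.exp(-5.0)))  # tiny: the stacked slices recover the true curve
```

Shrink `dt` and the stacked slices hug the exact solution ever more closely, at the price of more slices to compute.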

As every mathmo or physicist who’s ever run one of these time-marching computer simulations will tell you, choosing a time step (which we call \delta t) that is small enough to allow you to capture the solution, but large enough to be computationally viable, is lesson 1 in the numerical modelling of dynamic systems. The idea is exactly the same for 3D printing: choose a slice thickness that’s too big and you’ll miss all the detail in your object; choose one too small, and your printer will needlessly be working away for hours.

Choosing the right time step size becomes especially important when one deals with messy non-linear and chaotic systems, such as the one I’ve been looking at recently. In that case, choose too large a \delta t and you don’t just miss some detail – you can easily get knocked onto a completely different solution path altogether.
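That knock-on effect shows up even in a linear toy problem. A sketch, with forward Euler on dx/dt = -10x (nothing to do with the retail model below; the stability threshold \delta t > 0.2 is specific to this toy equation):

```python
# Forward Euler on dx/dt = -10x, x(0) = 1. The true solution decays to 0,
# but each Euler step multiplies x by (1 - 10*dt): harmless when dt is
# small, explosive once |1 - 10*dt| > 1, i.e. dt > 0.2.
def euler_decay(dt, steps):
    x = 1.0
    for _ in range(steps):
        x += dt * (-10.0 * x)
    return x

print(euler_decay(0.025, 400))  # near zero: tracks the true decaying solution
print(euler_decay(0.25, 40))    # enormous: blows up instead of decaying
```

With the oversized step the numerics don't give a blurry version of the decay; they give a qualitatively different answer.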

Demonstrating this concept is the motivation behind a quick visualisation I did on Monday, which shows just how far off the answer you can get if you pick too large a time step.

The model in the video is of a system of retail centres in a city. Customers choose to shop at a given centre based on how big it is, and how far away it is from their home. As shoppers choose shops and spend money, the profits (and losses) of each retail centre are calculated, and the retail centres change their size accordingly.
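The flavour of those dynamics can be sketched in a toy version of the Harris-Wilson / Boltzmann-Lotka-Volterra equations; all parameters and the geometry here are invented for illustration, so treat this as a sketch rather than the model in the video:

```python
# Toy retail dynamics: shoppers at each origin split their spending across
# centres according to size and distance, and each centre then grows or
# shrinks with its profit. Parameters and geometry are made up.
import math

ALPHA, BETA = 1.2, 0.4    # pull of centre size, deterrence of distance
EPS, KAPPA = 0.05, 1.0    # adjustment speed, running cost per unit of size
SPEND = [10.0, 10.0, 10.0]                 # money available at each origin
COST = [[1, 2, 3], [2, 1, 2], [3, 2, 1]]   # travel cost, origin -> centre

def step(Z, dt):
    """Advance the centre sizes by one forward-Euler slice."""
    new_Z = []
    for j in range(len(Z)):
        # Revenue D_j: each origin i splits its spend in proportion
        # to Z_k^alpha * exp(-beta * cost_ik).
        D = 0.0
        for i in range(len(SPEND)):
            w = [Z[k] ** ALPHA * math.exp(-BETA * COST[i][k]) for k in range(len(Z))]
            D += SPEND[i] * w[j] / sum(w)
        new_Z.append(Z[j] + dt * EPS * (D - KAPPA * Z[j]))
    return new_Z

Z = [1.0, 1.0, 1.0]
for _ in range(2000):
    Z = step(Z, 0.1)
print(sum(Z))  # settles near 30: total centre size balances total spend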

Incidentally, the derivation and equations – for those who are interested – can be found in an overview of the model I wrote a few months ago: An overview of the Boltzmann-Lotka-Volterra retail model.

Each centre is arbitrarily labelled (x axis), and the log of the corresponding centre size, \ln Z_j, is shown on the y axis.

The four types of circles in the video relate to four different choices of \delta t: royal blue has \delta t = 0.25, light blue has \delta t = 0.125, red has \delta t = 0.025, and gold takes the far smaller (and hence more accurate) \delta t = 0.0025, so that by t = 10 the four separate simulations have been through 41, 81, 401 and 4,001 time steps respectively. The dots are different sizes only to help in the visualisation.

Because it demonstrates the point a bit better, I’ve deliberately chosen a simulation which finds one winning centre: the “Westfield dominance” case, as one of my colleagues calls it. This can be seen clearly in the gold and red simulations, where the winning centre increases in size fairly rapidly at the beginning of the simulation, and all the other chaps slowly die away to \ln Z_j = -\infty.

These accurate red and gold runs behave nicely and smoothly, deviate little from one another, and show very little jumping around, even at large times. Bear in mind, though, that gold involves 10 times the number of calculations of the red (because the time step is 10 times smaller) but doesn’t offer us any more information.

Compare this, however, to the blue guys, who behave well in the early stages – all the circles begin concentric, suggesting that all \delta ts give the same results. As time increases, though, the light and dark blue circles with larger values of Z_j begin to deviate from the more accurate red and gold simulations, accelerating upward. As time increases further, the blue circles leave the gold and red altogether and do not return. They continue their jerky behaviour and end up becoming infinite.

The red case also involves 10 times the calculations of the royal blue (and 5 times those of the light blue), but in this case certainly does give us a lot more information. Picking too large a time step here doesn’t just give us a less accurate version of the solution – it doesn’t give us the solution at all.

Anyway. Enjoy the movie:


British Science Association Science Communication Conference

An early plug, then, for the British Science Association Science Communication Conference, taking place over the 25th and 26th of May here in London. The theme for this year is “online engagement”, featuring high-profile figures like Cory Doctorow talking about the democratisation of technology, Sophia Collins talking about her wonderful “I’m a scientist” project, and, slightly off-topic, Simon Singh discussing the well-known libel case launched against him by the British Chiropractic Association.

Under the online engagement umbrella, I’ll be running a discussion session on science podcasting, featuring Ben Valsler from the enormously successful Naked Scientists, Elizabeth Hauke from the DIY podcast Short Science, and Frank Dondelinger from the student-run Edinburgh Uni Science podcast. All of these panellists have unique perspectives on starting a new podcast, creating interesting and accessible content, finding a stylistic voice and growing an audience, as well as on the more technical aspects. Of course, being the live producer and a presenter on the Sony Silver Award-winning podcast “Answer Me This!”, as well as live producer on UCL’s Bright Club podcast and doing everything for my own music podcast, I will probably venture a few opinions over the course of the session.

The conference has a varied agenda, covering everything from engaging policymakers and diversifying your audience to using games and social media in science communication. The conference will take place on May 25th and 26th at King’s Place in London and you can register here.



Rank Clocks: showing time as time

Showing data as a time series enables us to see “data paths” – to simultaneously observe past, present and future, and to begin to spot trends. However, sometimes overloading an already complex graphic with a persistent time series will make the graphic dense and unusable.

So what about representing time as time? Represent the passage of time via the updating of the graphic – essentially some form of animation (obviously, this isn’t possible for a static graphic, e.g. in a printed article). To my eyes, this can have the effect of accentuating dynamics and, by mapping time onto time, sometimes giving a more realistic sense of the process occurring, especially when related to spatial flows.

The example below is drawn from Mike Batty and Ollie O’Brien’s excellent recent work on Rank Clocks. The concept behind a rank clock is that of a polar time series of “rank”. So, the time axis proceeds in a circle from 0 to 360 degrees, and the distance from the centre charts the value being measured. In a Rank Clock, the value being measured is the rank of something in a set of similar objects – so in the example below, the top 20 Japanese cities (by population) are plotted by rank. The lower a city ranks, the larger its rank value and the further it appears from the centre – so Tokyo is 1st and appears close to the centre, while the city ranked 20th appears 20 times further out.
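That polar mapping is easy to sketch; the year range and radial scale below are invented for illustration, not taken from the actual tool:

```python
# Rank clock geometry: time becomes the angle (sweeping 0 -> 360 degrees
# across the data's time span) and rank becomes the radius, so rank 1
# sits nearest the centre and rank 20 sits 20 times further out.
import math

def clock_xy(year, rank, y0=1900, y1=2000, r_scale=10.0):
    angle = 2 * math.pi * (year - y0) / (y1 - y0)  # fraction of the full sweep
    r = rank * r_scale                             # radius grows linearly with rank
    return (r * math.cos(angle), r * math.sin(angle))

x, y = clock_xy(1950, 1)  # a rank-1 city halfway round the clock
```

A full sweep of the angle corresponds to the whole time span of the data, which is what lets the animation "tick" round like a clock face.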

Given that Rank Clocks have the word “clock” in their name, I wanted to animate one to map time onto time. As I said before, this has the effect of accentuating some of the fast dynamics – and it becomes immediately obvious how stable the biggest cities are, well out of the long tail. The software offers tools for the user to select the rank range of interest (they can view all 300+ cities if desired, but messiness ensues), the timepoint at which the colour scheme is decided, the speed of the sweep, and the background alpha value (see below).

The tool works by drawing a small circle for each data point in each frame, representing the rank value at each timepoint. To calculate the rank values between the actual datapoints, a simple linear interpolation is employed. So no actual data lines are drawn – what appear to be lines are just overlapping ellipses.
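The interpolation itself is just the standard linear formula; a sketch with invented numbers:

```python
# Linear interpolation of rank between datapoints: if a city is rank 5 in
# 1950 and rank 9 in 1960, a frame in between gets a proportionate
# in-between rank, which is what makes the dots glide rather than jump.
def interp_rank(t, t0, r0, t1, r1):
    frac = (t - t0) / (t1 - t0)   # how far t sits between the two datapoints
    return r0 + frac * (r1 - r0)

print(interp_rank(1955, 1950, 5, 1960, 9))  # 7.0: halfway between rank 5 and rank 9
```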

The “alpha” value sets the transparency of a black box drawn over the whole data pane at each refresh: set this to a high value and the frame is refreshed completely each time, showing dots which represent the data at the current time; set it to zero, and every previously drawn frame persists indefinitely, eventually plotting out the entire time behaviour. Intermediate values give the data points a comet-like tail, showing their recent past clearly and their distant past dimly. These animations look like a frogspawn race, or possibly a cohort of sperm circling impotently around a central, elusive egg, and I rather like the aesthetic effect they produce.

This was written in Processing (with OPENGL and the ControlP5 toolbox to create control sliders), and this “alpha wipe” technique is one that’s very easy to use to create these smooth transitions. Don’t use a “background(colour)” call – instead, at the top of your “void draw()”, just add

fill(0, 0, 0, alpha);       // translucent black: alpha sets how quickly old frames fade
noStroke();                 // no outline on the wiping rectangle
rect(0, 0, width, height);  // cover the whole sketch window each frame

Set alpha to whatever value you want; as I said, I allow the user to control this variable. I use this technique A LOT for smoothing frame-to-frame transitions.

 

