Urban ARK Stakeholder Workshop Report: Niamey, Niger


Niamey Workshop Discussion

The workshop was attended by 30 diverse stakeholders, including lecturers from Abdou Moumouni University (UAM) in Niamey, researchers, PhD researchers, Save the Children representatives, state services, NGOs and other associations. The goals of the workshop were to (i) review the status of research in Niger and (ii) define operational indicators for the research project focused on flooding in Niamey. Following an introduction from Dr. Soumana Boubacar (UAM lecturer and Urban ARK research lead), SCI representative Deborah Taylor thanked the participants for attending. The opening remarks of the workshop were delivered by Prof. Marichatou Hamani, Vice Dean of the Faculty. After lunch, Alex Barcena (PhD researcher, King’s College) introduced Urban ARK’s international dimension. His presentation was followed by one from Dr. Dan Tankari Badjo Aderhamane, who discussed pollution risks in Niamey.

In addition to these presentations, a Skype call was held with Professor Mark Pelling (Urban ARK Principal Investigator, King’s College London). Participants were moved by Prof. Pelling’s encouraging words, and he also used the opportunity to raise important questions. The next presentations focused on the research project led by Dr. Soumana Boubacar in collaboration with King’s College and Save the Children International. These presentations were followed by group work, which allowed all stakeholders to contribute to defining indicators to measure the components of stability at the household level, particularly in relation to children, and to discuss these as a group. Finally, the Vice Dean gave the closing speech, thanked all participants and highlighted the significance of the project.

Standfirst: 

Urban ARK Niamey partners held a stakeholder workshop on 13 June 2016 at Abdou Moumouni University.


Better regulation through ‘Big Data’?

This Thursday 23rd June, Alex Griffiths from the School of Management & Business will give a seminar on the use of ‘big data’ in regulating public service provision. 

Better regulation through ‘Big Data’:
A triumph of hope over reality?

 Alex Griffiths, School of Management & Business, King’s College London
10:30-12:30, Thursday 23 June 2016
Room K1.26, King’s Building, King’s College London, Strand, London, UK

‘Big data’ enthusiasts often claim that data analytics is the key to better regulation and improved public service provision. By harnessing the power of big data, regulators can identify those service providers at greatest risk of non-compliance and target their interventions accordingly. This promises both to concentrate regulatory efforts where improvements are needed most and to free others from unnecessary scrutiny. Whilst such data-led approaches have been widely adopted in the private sector, whether in credit scoring loan applicants or recommending similar products to online shoppers, to what extent can they be successfully extended to the regulation of public services?

This seminar evaluates two extant data-driven approaches to regulating healthcare quality, before assessing whether machine-learning techniques can provide a more effective means of targeting regulatory resources in health and higher education. The presentation concludes with a discussion on the preconditions necessary for a successful ‘big data’ approach.


Directions: From the main Strand reception, go straight ahead down the corridor. Turn left into the East Wing corridor just after the vending machine; the following rooms are up the small staircase to your immediate right: K1.26 (21B), King’s Building.


El Geografico 2016

BA squad

BSc squad

By Nick Burgess, Year 3 BSc student.

Historical geographer David Livingstone, in his 1992 classic “The Geographical Tradition”, wrote of geography as a contested discipline. On 24 May 2016, Livingstone’s historical contestation was brought to life through a glorious encounter between geography’s biggest rivals, BA and BSc. With England preparing for the European Championships and lecturers jetting off to European conferences, third-year students (plus one PhD student and a particularly young-looking lecturer) had undergone weeks of fartlek training in Hyde Park and Holland Park. Blessed with a break in the rain, students of the third-year BA and BSc cohorts took to the field at Wormwood Scrubs in a quest to get their hands on the globally renowned ‘Hulme Cup’, a prize worth more than a fully funded NERC or ESRC grant and rumoured to lead to geographical immortality at King’s. Whilst the contest undoubtedly sought to provide bragging rights for one of the cohorts, the event was organised in aid of the National Autistic Society, raising an incredible £160 over 90 minutes.

Analysing Drone Data: 3D forest point clouds

In this guest post, King’s Geography PhD student Jake Simpson describes some of his geocomputational work analysing data from tropical peat swamp forests to estimate carbon emissions.

In December 2015, I travelled to an area just outside Berbak National Park, Sumatra. This area comprises both pristine and impacted tropical peat swamp forest, which is one of the world’s most important terrestrial carbon stores and home to a small, endangered population of Sumatran tigers. The area was heavily impacted by wildfires between July and October 2015, which made headlines across the world when the pollution from the fires reached as far as Vietnam! In total, about a fifth of the area burned and, in the process, emitted a globally significant amount of carbon into the atmosphere.

A NASA satellite image showing the extent of the haze on 24 September 2015 (Public Domain)

Quantifying the carbon emitted is a tricky business because it cannot be measured directly.  One way to estimate the emissions is to measure the amount of peat that burns away.  Using digital elevation models (DEMs), the volume of peat burned is estimated by subtracting the post-burn DEM from the pre-burn DEM.  We have access to a pre-burn DEM from an airborne LiDAR survey for an area of peat swamp forest.  With some clever filtering, the ground level can be extracted from the LiDAR data, even when dense forest is present.
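
To make the differencing step concrete, here is a minimal Python sketch of the calculation, assuming two co-registered, equal-resolution GeoTIFF DEMs read with rasterio; the file names, nodata handling and the bulk-density and carbon-fraction constants are illustrative placeholders, not values from this work.

```python
# Sketch: estimate burned peat volume by subtracting a post-burn DEM
# from a pre-burn DEM (file names and constants are hypothetical).
import numpy as np
import rasterio

with rasterio.open("preburn_lidar_dem.tif") as pre, \
     rasterio.open("postburn_uav_dem.tif") as post:
    pre_z = pre.read(1, masked=True)    # ground elevation before the fires (m)
    post_z = post.read(1, masked=True)  # ground elevation after the fires (m)
    # pixel area from the affine transform (assumes a projected CRS in metres)
    pixel_area = abs(pre.transform.a * pre.transform.e)

# depth of burn per pixel, clipped so areas of apparent gain contribute nothing
depth_of_burn = np.maximum(pre_z - post_z, 0)            # metres of peat lost
volume_burned = float(depth_of_burn.sum() * pixel_area)  # cubic metres of peat

# illustrative conversion: bulk density (kg m^-3) x carbon fraction of dry peat
BULK_DENSITY = 100.0   # hypothetical value
CARBON_FRACTION = 0.5  # hypothetical value
carbon_emitted_t = volume_burned * BULK_DENSITY * CARBON_FRACTION / 1000.0

print(f"Burned peat volume: {volume_burned:.0f} m^3")
print(f"Approx. carbon emitted: {carbon_emitted_t:.0f} t C")
```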

We then used a very cheap unmanned aerial vehicle (UAV) with a camera strapped to the bottom to survey the post-burn area and extract a DEM. This technique is called structure from motion (SfM). For a given area, multiple photos are taken from different angles and then loaded into software called “Agisoft Photoscan”. The software uses photogrammetry algorithms to identify common points between photographs and aligns them. Other algorithms compare the locations of these common points in relation to each other and, in doing so, reconstruct a 3D point cloud of the surface. This process is incredibly computer-intensive and can take several days to complete, especially when up to 1,500 photos are used per survey. The steps I took in the analysis are summarised below.

Overall, I processed 8 UAV surveys, which equates to over 8,500 photos and over 2.5 billion point cloud data points.  Thanks to the Geocomputational hub, I was able to process these photos and am in the process of writing up the analyses for a paper. Stay tuned…

Step 1:  Photos are aligned, camera positions are predicted (blue), tie points detected.


Step 2:  Identify ground control points (with coordinates measured in the field) in the photos for georeferencing purposes.


Step 3:  Build dense point clouds, DEMs, orthomosaic photos.  Here is a before and after shot of the forest we surveyed.
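
For readers who would rather script these steps than run them through the GUI, here is a minimal sketch of the Step 1 to Step 3 workflow using the Agisoft PhotoScan (now Metashape) Python API; the module, method and argument names vary between versions and are assumptions here, and the file paths are hypothetical rather than the exact commands used for these surveys.

```python
# Illustrative batch script for Steps 1-3 (names and paths are assumptions).
import glob
import PhotoScan  # newer releases expose the same workflow as 'Metashape'

doc = PhotoScan.Document()
doc.save("postburn_survey.psx")   # some processing steps require a saved project
chunk = doc.addChunk()

# Step 1: load the UAV photos, detect tie points and align the cameras
chunk.addPhotos(sorted(glob.glob("survey_photos/*.JPG")))
chunk.matchPhotos()    # feature detection and matching between overlapping photos
chunk.alignCameras()   # estimates camera positions (shown in blue) and tie points

# Step 2: ground control points measured in the field are added as markers and
# used to georeference the model (placed interactively in the GUI in this workflow)

# Step 3: build the dense point cloud, DEM and orthomosaic
chunk.buildDepthMaps()     # older versions build depth maps inside buildDenseCloud()
chunk.buildDenseCloud()
chunk.buildDem()
chunk.buildOrthomosaic()

doc.save()
```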