The ultimate survival kit for your spatial data


“Ubi amici, ibi opes: Where you find friends, there you’ll find riches.”
Plautus, 200 BC

“Where you find metadata, there you’ll find data.”
Antonius Mathus, AD 2014

Research is fundamental to all disciplines in academia and data output is often the result of this endeavour. Most universities view research data as a valuable asset that requires a management strategy to promote and support long-term data curation, preservation, access and re-use.

Universities need the resources to tie together the policies, infrastructure, tools, processes and training to support research data management. The Joint Information Systems Committee (Jisc) has played a key role in providing these resources to many universities through a range of programmes including the following:

  • Repositories and Preservation Programme, which provided an investment of £14 million in Higher Education repository and digital content infrastructure.
  • Information Environment supporting digital repositories and preservation, including cross-searching facilities across repositories; funding for institutions to develop a critical mass of content, preservation solutions and advice for the development of repositories.
  • Jisc Managing Research Data (JiscMRD) programme, which supported UK academic institutions in their efforts to develop internal research data management policies to ensure data re-use.

The GoGeo service is another example of the Jisc commitment to UK academia to provide resources to securely manage and share research data that have a geographical (spatial) component. The free service offers the following resources for managing research data:

  • Geodoc metadata editor tool, which allows users to create, edit, store, import, export and publish standards-compliant (ISO 19115, UK GEMINI, INSPIRE, Dublin Core, DDI) metadata records;
  • GoGeo portal, which offers users the option of publishing their geospatial metadata records to public or private catalogues, the latter for those who want to control and restrict access to information about their spatial data;
  • ShareGeo, a repository for users to upload and download spatial data; and
  • geospatial metadata workshops, which use presentations and hands-on practicals to introduce attendees to geospatial standards, metadata, geoportals and the GoGeo service.

The ultimate survival kit for your spatial data is a guide that provides a concise overview of these GoGeo service resources which can serve as a complement to your current research data management practices if your datasets have a spatial component. This guide also shows how the GoGeo service resources can be used to manage your spatial data information (metadata) and share it with your project colleagues, or with researchers and students in your department or academic institution.

You’ll discover that

  • it’s much easier and more efficient to use Geodoc to create and export a metadata record to bundle with its spatial dataset than it is to send the dataset without any information to a colleague who might return with questions. Your colleague can also import your metadata record into Geodoc and update it if edits are made to your shared dataset (see the sketch after this list).
  • it’s much easier and more efficient to use Geodoc to create and publish metadata records to a private research metadata catalogue on the GoGeo portal than it is to send bundles of metadata records or spatial data information to fellow researchers.
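
To make the first point concrete, here is a minimal sketch of building a bare-bones metadata record in Python. The element names follow Dublin Core terms and the dataset described is invented; a real Geodoc export would be a much richer ISO 19115 / UK GEMINI document:

    import xml.etree.ElementTree as ET

    # A tiny, illustrative record; Geodoc itself manages many more fields.
    record = ET.Element("metadata")
    for field, value in [
        ("title", "River survey sample sites, 2013"),   # hypothetical dataset
        ("creator", "A. Researcher"),
        ("format", "ESRI Shapefile"),
        ("coverage", "Scottish Borders, UK"),
    ]:
        ET.SubElement(record, field).text = value

    ET.ElementTree(record).write("survey_metadata.xml",
                                 encoding="utf-8", xml_declaration=True)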

The ultimate survival kit for your spatial data goes into more detail about what the GoGeo service has to offer for spatial data management and sharing, whether at the personal level, amongst trusted colleagues, or visible to the world if you have no further need of your spatial data and wish to share it with others who could benefit from your research endeavours. Others may well hold data that could benefit your research too.

Please contact me to request a copy of this guide. The guide will include a questionnaire, and if you answer the 10 questions, you will receive a GoGeo-Geodoc coffee mug filled with chocolates. There is nothing to write other than your name and address; each question can be answered with the tick of a box.


Thank you very much.

Tony Mathys
Geospatial Metadata Co-ordinator
EDINA
The University of Edinburgh
160 Causewayside
Edinburgh EH9 1PR

My Desk tel: (0)131 651 1443
EDINA Help Desk tel: (0)131 650 3302

email: tony.mathys@ed.ac.uk

An electronic version of The ultimate survival kit for your spatial data guide can be found on the GoGeo portal’s Geodoc login page at http://www.gogeo.ac.uk/gogeo/metadata/geodoc.htm

Posted in Data, Inspire, JISC, Research

EDINA GeoForum 2014

EDINA hosts an annual gathering for its GeoServices with the aim of connecting with users from institutions around the country.  This year’s event was held in Edinburgh on the 19th June.  The event went well and there was a buzz around the Informatics Forum venue.  I don’t really want to provide a summary of the event as there is already a great summary on the Digimap Blog and, if this doesn’t provide enough detail, the live blog transcript should (props to @suchprettyeyes for the live blog – no idea how she can record everything in real-time).

What I would like to do here is discuss a couple of topics that surfaced during the day.

Know your users

Who uses GIS data?  Geographers, of course, is the obvious answer, but the use of geospatial data is now much wider than just earth and environmental science.  EDINA has recognised this for some time and has worked hard to make its service interfaces as intuitive as possible.  In addition, there has been a conscious decision to promote best practice through the interfaces and to use the correct language, so that users actually learn about GIS and geospatial terms just by using the services.

Geoservice Personas

GeoForum provides a vital link between the service team and users.  It is our chance to speak to users directly and for users to provide feedback on what they like, what they don’t like and what they would like to see in the service.

Turning Data into Information

Some users want to get their hands on the raw data so that they can use it as base data for their own analysis; others prefer to receive a polished product that will add value to their coursework or research.  EDINA’s geoservices try to accommodate such diverse user needs.  The role of many geospatial professionals is to take data and turn it into useful information.


Data and Information

This message was echoed by keynote speaker Peter Gibbs of the UK Met Office.  Peter eloquently demonstrated the vast number of data sources that feed into our weather reports. The meteorologist’s job is to take this data, analyse it, produce a best estimate and present it in an easy-to-understand format accessible to the general public.  The public don’t really care how you created the forecast; they just want to spend less than 2 minutes finding out if they need to take a brolly to work.  This encapsulates much of the geospatial industry’s role: turning data into usable information which can inform decisions.

Connected systems and data

Everything is linked. Virtually nothing can be considered in isolation.  This means that many users will be consuming geospatial data from EDINA and combining it with other datasets.  EDINA has recognised this and has started to connect some of its collections in Digimap.  For example, you can create an annotation in one collection and then access it in another.  This allows users to map historic features, or trace geological features and visualise these on modern OS maps. But we are now thinking about taking this further and investigating how to overlay data from one collection in another.  There is a bit of work to be done here but it could open things up.  Why stop at just overlaying EDINA Digimap data in other Digimap collections? Would it be useful to be able to overlay external feeds from organisations such as the Environment Agency or SEPA in Digimap Roam?
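
Feeds like these are usually published as OGC Web Map Services (WMS), so overlaying one is essentially a GetMap request. A minimal sketch using OWSLib; the endpoint URL is a placeholder, not a real Environment Agency or SEPA address:

    from owslib.wms import WebMapService

    # Placeholder endpoint; substitute a real WMS URL.
    wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")
    layer = sorted(wms.contents)[0]            # first advertised layer name

    img = wms.getmap(layers=[layer],
                     srs="EPSG:27700",         # British National Grid, as in Roam
                     bbox=(320000, 660000, 330000, 670000),
                     size=(600, 600),
                     format="image/png",
                     transparent=True)
    with open("overlay.png", "wb") as out:
        out.write(img.read())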

Mobile

The rise of the smartphone seems unstoppable.  Almost everyone has one and we are increasingly accessing web services through our mobiles.  Fieldtrip GB is a free app from EDINA that runs on Android and iPhone and allows users to collect data on their smartphone.  What does it do?

  • good, clear cartography, just as you would expect from EDINA’s geoservices team
  • users can design their own data collection forms to suit their needs
  • the app is designed to work in “offline” mode, meaning you can pre-load maps and don’t require a 3G signal to use it in the field
  • exports data to CSV, KML and GeoJSON (see the sketch below)
  • did I mention it is free?

Fieldtrip GB
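
Because the exports are open formats, pulling collected data into an analysis is straightforward. A minimal sketch reading a GeoJSON export in Python (the file name is illustrative):

    import json

    # Load a Fieldtrip GB GeoJSON export.
    with open("fieldtrip_export.geojson") as f:
        collection = json.load(f)

    # Each collected record is a GeoJSON feature: a point plus the form fields.
    for feature in collection["features"]:
        lon, lat = feature["geometry"]["coordinates"][:2]
        print(lon, lat, feature.get("properties", {}))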

In addition to Fieldtrip GB, EDINA is working on a GoGeo app which will help people keep up to date with geospatial news and events as well as allowing users to discover data while on the move.

What’s on the horizon?

The geoservices team are constantly updating and upgrading services.  Some of this work is invisible to the user as it is backend stuff: optimising databases, improving searching and just making sure the services are as fast and reliable as possible.  But there are a number of exciting projects that should offer users new functionality over the next year.  The easiest way to find out more is to flick through Guy McGarva’s forward-looking presentation.

Posted in Conference, Digimap, EDINA, GIS

GeoBusiness 2014 – review

A couple of weeks back I attended the first GeoBusiness conference in London.  It was an interesting event and I have been meaning to write up my thoughts on it, but keep getting snowed under with last-minute jobs.  I have finally managed to clear some time and can report back to you all on what happened at the event.

I decided to go to the conference to see what the public and commercial sectors were working on and what they thought should be the current focus for the GI sector.  Neil Ackroyd, the Acting Director General and Chief Executive of the Ordnance Survey, opened proceedings by summarising the view of the sector from the main data provider’s perspective.  Condensing his talk to a few key points, I would say the OS were focusing on networks (in terms of geographical networks such as rivers, railways and paths) and collaboration.  They are increasingly working directly with organisations to deliver bespoke data that can be used to support large building infrastructure projects, or for events such as the Olympics.  The OS are currently working on hosting data in the cloud, essentially having unstructured data that is accessible to users.  Storing the data as “unstructured” means that you can apply structure as it is accessed and tailor this to the client’s needs (a toy sketch follows the list below).  The advantage is that you have one definitive source rather than multiple versions that are subtly different but which all require maintaining.  Neil closed with two take-away thoughts:

  1. Know your market
  2. Simplify things for them
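
Returning to the unstructured-data point: the pattern Neil described is often called “schema on read”. A toy illustration in Python (an assumption about the general idea, not the OS’s actual cloud architecture; the feature attributes are invented):

    # Features held as loose key/value documents; each consumer applies
    # only the structure it needs at read time.
    features = [
        {"id": 1, "class": "building", "height_m": 12.5, "use": "retail"},
        {"id": 2, "class": "road", "surface": "asphalt", "lanes": 2},
    ]

    def building_heights(docs):
        """One client's 'schema': buildings that carry a height attribute."""
        return [(d["id"], d["height_m"]) for d in docs
                if d.get("class") == "building" and "height_m" in d]

    print(building_heights(features))   # [(1, 12.5)]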

After a short coffee break I attended the Making Data Deliverable strand.  The first talk of the session was given by Paul Hart (Black & Veatch), who discussed the use of GIS visualisations to convey complex information to the public. The examples centred around flood alleviation schemes, where different scenarios and their resulting benefits could be presented in an interactive way.  The use of 3D views with true-colour aerial image backdrops allowed non-geo experts to engage with the data.  The output summarised several hundred model scenario runs in an easy-to-digest way. I did have a couple of issues with the visualisation: the first was the use of red and green which, while intuitive in terms of good/bad, would not be particularly colour-blind friendly; the second was that the visualisation didn’t really convey uncertainty.  Including uncertainty would possibly complicate the visualisation, but the public may incorrectly assume that the flood outlines were accurate rather than the best estimate from modelling.  I questioned Paul about this and he explained that the maps were presented to a closed audience with experts on hand to explain them.  He agreed that displaying uncertainty on such maps could over-complicate them.

This was followed by another talk focused on visualising data. Lingli Zhu from the National Land Survey of Finland demonstrated the work they had been doing to visualise landscapes using the Unity game engine.  Unity has been used in popular games such as Gut and Glory, but can be easily adapted to produce realistic simulations and can help users visualise environmental change.  However, Unity does not allow users to specify a real-world geographic reference frame, which means any geographic data has to be shoe-horned into the virtual world.
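
A common workaround is to shift coordinates into a metre-based frame around a local origin before import. The sketch below uses a crude equirectangular approximation, adequate over a few kilometres; this is an assumption about the general approach, not a description of the NLS Finland pipeline:

    import math

    ORIGIN_LAT, ORIGIN_LON = 60.1699, 24.9384   # arbitrary local origin (Helsinki)
    R = 6378137.0                               # WGS84 equatorial radius, metres

    def to_local(lat, lon):
        """Convert lat/lon to metres east/north of the local origin."""
        x = math.radians(lon - ORIGIN_LON) * R * math.cos(math.radians(ORIGIN_LAT))
        z = math.radians(lat - ORIGIN_LAT) * R  # Unity treats x/z as the ground plane
        return x, z

    print(to_local(60.1710, 24.9410))           # a point a few hundred metres away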

The second part of the session focused on BIMs.  BIMs (Building Information Modelling) have been the subject of several events over the past couple of years and they seem to make sense, but they span several worlds at once, straddling CAD, GIS and asset management.

First up was David Philip, Head of BIM Implementation at the Cabinet Office.  David gave a great overview of BIM implementation with a presentation that was peppered with light humour.  David detailed the “3 tribes” living in the BIM world: CAD users, GIS users and BIM users.  BIMs should be an open, shareable asset that unites CAD and GIS users. David pointed out the importance of BIMs throughout the life of a building, as the cost of building (capex) is much smaller than the cost of running or operating (opex) a building.  Therefore, the BIM is a critical tool in maximising the efficiency of a building throughout its lifecycle and should aim to be an “open shareable asset information system”.


BIM Task Group

David closed by pointing out that we often suffer from “Infobesity” and we should better understand which data we need to retain and which we can get rid of.  Keeping everything is just not a sustainable approach.

The next presentations in this session provided insight into actually implementing BIMs in the commercial sector.  Peter Folwell (Plowman Craven), Matthew McCarter (London Underground) and Casey Rutland (Arup) gave honest opinions of the highs and lows of working with BIMs.  The consensus from these presentations was to implement a BIM early rather than as an after-thought that ticks a box. Setting up a BIM early will allow the project to reap the benefits in terms of organisation, data flow and cost savings.  Also, 3D scanning seemed to be at the heart of the BIM, but this should not be seen as a one-off task; regular scanning can help partners visualise the evolution of a project and help identify potential issues.  However, multiple scans need not mean multiple BIMs, just add them to the existing BIM.  One aspect that surprised me was the strength of the BIM community on social media.  There seems to be an active community on Twitter that is happy to share best practice and offer general advice.  Just search for hashtags such as #ukbimcrew, #laserscanning or #pointclouds.  If you want to find out more about BIMs, then look at the BIM Task Group website.

After lunch I attended the Global Trends session, which had a wide range of talks, from legal issues surrounding geospatial data to downstream service opportunities from remote sensing data.  Ingo Baumann discussed the legal constraints surrounding geospatial data, focusing particularly on open data licences and issues around personal data.  One of the key problems is a lack of consistency between countries; Google discovered this publicly while rolling out StreetView across Europe.  There is no specific geospatial law, but it is coming.  Until then, I will be keeping an eye on useful blogs such as Spatial Law and Policy.

Carla Filotico (SPRL) highlighted the value of remote sensing data and the downstream service opportunities.  Agri-business could benefit hugely from data from new satellite programmes such as Copernicus, and it is estimated that this represents a €2.8 billion market in the EU.  For more information on the Copernicus mission and its recent launches of Sentinel satellites, please refer to the ESA website.

The final session I attended was on survey operations and system integration.  The first talk, by Dipaneeta Das, was well delivered but I felt it was pitched at the wrong level: much of the time was spent explaining web mapping, but I suspect nearly all of the attendees already knew about the advantages web mapping offers for disseminating information to the public.  The other two talks were really interesting and focused on data acquisition.  John McCreedy (IIC Technology) walked the group through the pros and cons of various survey techniques including laser scanning, lidar and structured light (think Xbox Kinect). One interesting snippet was that aerial photography often captured more detail than other “newer” techniques.  This sentiment was echoed by James Eddy (Bluesky), who continue to collect high-resolution aerial photography of the UK and beyond.  You can even collect aerial images at night.  Why, you might ask?  Well, to capture information about light pollution and to monitor “dark spots” in cities. This information can then feed into spatial analysis of crime and anti-social behaviour, helping the police and councils target resources.


Bluesky’s Night Aerial Images – courtesy of Bluesky

The takeaway message from this session was that clients are increasingly specifying technology when commissioning surveys. This may not be wise; it is often better to specify what they expect as a final product and leave decisions on technology to the experts, who will ensure that the most appropriate technology is selected.  I suppose that is, and always has been, the role of the expert in any field.

Summary

GeoBusiness 2014 seemed to be a success.  The talks were interesting, the audiences engaged and you could see that there was a whole heap of networking going on.  I will write a more detailed post on how I see this event in terms of the academic sector, but it just remains for me to thank the conference team for putting together a great event.  I am looking forward to GeoBusiness 2015.

Posted in Conference

FME World Tour 2014

Another great guest post, this time by two of EDINA’s geodata team, James Crone and Mike Gale. James and Mike attended the Edinburgh leg of FME’s 2014 World Tour, which was held at Our Dynamic Earth on Thursday 15th May. EDINA use FME through Safe Software’s FME Grant Program.

The day consisted of a series of presentations covering new features of the latest 2014 release of FME and how FME is being used locally within Scotland and the UK. The quality of the presentations was very high, pitched at a technical audience and delivered by an enthusiastic set of presenters who in many cases were not afraid to start up FME Workbench, build or edit geoprocessing workspaces and then run them live in front of an audience. In doing so they demonstrated brilliant tips on how to use FME Workbench more efficiently. There was also a lot of audience participation to break up the formal presentations, including an FME Cool Wall and the FME Quiz, more on which later.

Of the presentations, our highlights were:

Managing the Angus Council back-office and supporting the GI infrastructure with FME

During this talk, the presenters from the Angus Council GIS team, who introduced themselves as sharks with lasers, demonstrated the wide use that FME had been put to within a Scottish local authority. Through some FME Workbench wizardry, FME processing flowlines were used to help with the planning process (applications for wind farms) and to harmonise LLPG (Local Land and Property Gazetteer) data. One great quote that came from this presentation was that FME allowed Angus to provide “a single version of the truth” – which, if you have ever worked in a local authority, you will completely understand!

FME’s MapnikRasterizer makes happy cartographers

Mapnik is an open source map renderer initiated by Artem Pavlenko, and tiles rendered through Mapnik provide the default layer in OpenStreetMap. We’ve been using Mapnik for some time internally within EDINA to render geospatial datasets directly from Python without the need to go to the trouble of firing up a GIS application. In this talk David Eagle from 1Spatial ran through the features of the new FME MapnikRasterizer. The MapnikRasterizer is an FME transformer which can be dropped into any FME Workbench workspace and used to create a map rendering of the features being processed, which is pretty cool. Combining this with other transformers to create tilesets makes things even more interesting. One of my pet hates is having to manually set up styles using GUIs, it being more efficient to do so in an external file; with this in mind, one of the things shown during the presentation was that the MapnikRasterizer can be supplied with the sets of styles to be rendered coded up in an external spreadsheet, which is neat.

Slides: http://www.slideshare.net/SafeSoftware/fmes-mapnikrasterizer-makes-happy-cartographers
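
For context, this is roughly what the transformer wraps up: rendering with Mapnik’s own Python bindings. A minimal sketch, assuming layers and styles are defined in a Mapnik XML stylesheet called style.xml:

    import mapnik

    m = mapnik.Map(800, 600)
    mapnik.load_map(m, "style.xml")   # layers and styles from the XML stylesheet
    m.zoom_all()                      # fit the map to the data extent
    mapnik.render_to_file(m, "map.png", "png")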

FME process optimisation, an exercise in best practice at the Ordnance Survey

In this talk, David Eagle talked about how FME technologies sit at the heart of the data update/verification process used by the Ordnance Survey to keep MasterMap up to date, and how they’ve been able to optimise the processes to make things more efficient. This included some best practices which are shown in this set of slides:

Slides: http://www.slideshare.net/SafeSoftware/best-practices-in-fme-2014

BIM – Building Information Modelling

While not included directly as a presentation, it is clear that one of the hot topics at the moment with FME is BIM. Several of the talks referenced BIM and indicated large expected future use. As the UK Government is planning to adopt BIM as a data standard in 2016, the demand for BIM data is going to explode over the next year. Speaking directly with one of the guys from 1Spatial, FME can currently read BIM data but not write to the format. This is all about to change, with a BIM writer currently being designed and a beta release scheduled for September. So currently it’s a case of watch this space.

UK BIM Task Group: http://www.bimtaskgroup.org/

If the UK leg of the FME World Tour gets around to uploading presentations from the event, this is where you will find them: http://worldtour.safe.com

Outwith the formal presentations, the wonderful 1Spatial people ran two sessions to generate audience participation: the FME Cool Wall and the FME Quiz.

Most people should be familiar with BBC Top Gear’s Cool Wall, where Jeremy Clarkson et al place a picture of a new sports car on a wall divided into sections indicating how cool the car is, from uncool to cool to sub-zero. Well, at the FME World Tour the audience split into four groups; each group came up with three new features of FME, or ways FME was being used, and these were then added to the FME Cool Wall.

Across the groups, the ability of FME to perform complex geoprocessing without any need to write code was a resounding sub-zero on the coolness scale, although at the same time the sometimes bewildering number of transformers available in FME Workbench, and knowing which one to pick when two or more seemed to do similar things, was uncool.

The day finished off with the FME Quiz, in which a series of multiple-choice questions on all things FME were shown and the audience had to reply via email on their smartphones. EDINA won the prize for the first question, as we twigged early on that setting up an email in Gmail so that we could quickly submit our answer was a good strategy. As it was the first question, Mike and I got the first look at the prize swag on offer and grabbed a pair of FME World Tour 2014 t-shirts in a lovely shade of olive green with an FME dirigible on the front and a series of tour dates listed on the back.

So, overall, an extremely useful technical day, and thanks to the highly enthusiastic 1Spatial team for all the insights into FME.

Posted in Uncategorized

GeoBusiness 2014 – a preview

GeoBusiness 2014 is less than a week away.  This is a new event and I am looking forward to seeing what it will be like.  The organisers have certainly pushed the event, with short magazine inserts listing who is exhibiting and presenting.  GoGeo will be there, and I thought I would explain why we are attending and what we hope to get out of the event.

It’s new and it’s big

Pretty self-explanatory, but also significant.  This is a chance to speak to all the major software vendors and find out what enhancements they have in development.  In addition, there is a host of companies that offer GI services.  I want to see what these are up to and report on what looks innovative and interesting.  These companies collectively employ a significant number of GIS graduates each year.  Many of them are exploiting new and emerging technology such as unmanned aerial vehicles (UAVs).  As such, they are really quite dynamic places to be employed as a fresh-faced graduate.

Workshops

There are a number of interesting workshops being run by companies to highlight the innovative analysis they are doing. There seems to be a cluster of workshops around 3D laser scanning, UAVs and Building Information Modelling (BIM).  There is also a strand that focuses on professional development.

Content

Content for GoGeo and perhaps even ShareGeo.  That means news articles, blog posts and so on for GoGeo.  With ShareGeo, it would be great to get some sample data from companies so that lecturers could use it in their lessons.  I will be looking to convince some of the UAV and scanning companies to share some data through ShareGeo.  If you don’t know what ShareGeo is, it is a repository for open geospatial data that enhances teaching, learning and research.

So if you already have a ticket, I might see you there. If you don’t have a ticket, there is still time, and there are special rates for students (£25 per day if you pre-book).  Students, do your research on the companies attending and speak to people to find out what they do; it is a great opportunity to see the diverse range of jobs available in the GI market.

Geobusiness 2014 website

Posted in Conference

NHS Health Atlas – risk and disease


Risk of Melanoma – from BBC and Imperial College London

NHS Choices have published a health atlas that maps the risk of a number of illnesses across England and Wales. The research behind the map, which compiles data from over 25 years, was carried out by Imperial College London.

The data was collected between 1985 and 2009 from the ONS and from cancer registers. The 12 diseases and conditions that have been mapped are:

  • Lung cancer
  • Breast cancer
  • Prostate cancer
  • Malignant melanoma
  • Bladder cancer
  • Mesothelioma
  • Liver cancer
  • Coronary heart disease
  • COPD mortality
  • Kidney disease
  • Stillbirth
  • Low birth weight

A cursory glance at the map will reveal expected trends, such as the risk of skin cancer being higher in the South-East, where there is more sunshine, and higher risks of lung cancer coinciding with larger cities, where airborne pollutants are more likely. However, I am sure that there are other interesting observations that could be extracted if you have time to explore the data.

You can explore some of the data on the NHS Choices website and read about it on the Independent and the BBC website.

I will try to find the data and post it in ShareGeo, but until then you might want to explore this dataset that shows deaths related to air pollution.  I really need to get some happier datasets into ShareGeo!

Posted in Data, GIS, Research

GISRUK 2014


Glasgow – willsnewman (flickr)

Jane Drummond opened the 22nd GISRUK conference and explained that pink was the colour of the conference, hence the helpers wearing pink T-shirts. This might also explain the pink umbrellas the last time GISRUK visited Glasgow.

Wednesday

Mike Worboys gave the keynote, “A Theoretician’s Eye View of GIS Research”. He highlighted the dramatic fall in the proportion of GISRUK papers that cover the theoretical side of GIS and mused that perhaps we had covered it all; in the end he highlighted several areas where there was still much theory to be discussed, including geo-semantics and geo-linguistics.

In the Urban Environment session, chaired by Peter Halls, we saw William Mackaness talk about Spacebook, a system for delivering directions via audio as users encounter various waypoints on a route. The research found that using landmarks gave better results than street names in terms of getting someone from A to B.

Phil Bartie, who was a researcher on William Mackaness’s paper, delved deeper into the issue of landmarks. He was using images to find out what people identified as landmarks and was analysing them semantically and spatially to distinguish related and unrelated features. His use of trigrams, or groups of three words, may well be a solution to issues with obtaining good search results from EDINA’s place name gazetteer.
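
As a toy illustration of why trigrams help with fuzzy matching (character trigrams here for brevity, where the talk used groups of three words; an assumption about the mechanics, not Phil’s implementation):

    def trigrams(s):
        padded = f"  {s.lower()} "            # pad so the edges form trigrams too
        return {padded[i:i + 3] for i in range(len(padded) - 2)}

    def similarity(a, b):
        """Jaccard overlap of trigram sets: 1.0 means identical strings."""
        ta, tb = trigrams(a), trigrams(b)
        return len(ta & tb) / len(ta | tb)

    print(similarity("Edinburgh", "Edinbrugh"))   # ~0.43: robust to the typo
    print(similarity("Edinburgh", "Glasgow"))     # 0.0: unrelated names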

Nick Malleson was next, talking about using tweets as a proxy for ambient population. Despite the issues with the quality and bias of Twitter data, he found that it still overcame the problems of using census data for city-centre populations when assessing crime rates. The peaks seen in crime rates for the main shopping and socialising areas disappeared once they were adjusted for the number of people present rather than the number actually living there. Outside of these areas, crime rates were still high in areas with social problems, as shown by using census data.

The use of Twitter in research continues to raise interesting questions about sampling validity and ethics; this would continue into the second day.

Thursday

Thursday was the only full day in this year’s GISRUK programme and had three parallel sessions.

Spatial Analysis: the best two talks were really quite different. Georgios Maniatis discussed error quantification and constraints in environmental sensors.  Georgios was looking at sediment movement in rivers; using a local reference frame offered accuracy improvements but added further complications, not least that a significant portion of the signal travel time was through water. Given the small distance from transmitter to receiver, errors could quickly become significant.

The other talk that stood out looked at visualising the active spaces of urban utility cyclists, given by Seraphim Alvanides on behalf of Godwin Yeboah. Their analysis clearly showed that in certain areas of Newcastle the cycle infrastructure was misaligned with where cyclists actually rode. Cyclists used more direct routes to get to work and were more likely to detour on the way home to do shopping or other leisure activities. It does not help that the Newcastle Metro, which is operated by Deutsche Bahn, does not allow cycles onto its trains; in continental Europe operators seem more amenable to such integration.

Citizen Survey: this session looked really interesting, and Neil Harris (Newcastle University) kicked off with a very interesting description of a heterogeneous sensor infrastructure which used a schemaless approach.  They had effectively decided to avoid XML and used key-value pairs instead.  By using HStore they were able to hook things up with Postgres/PostGIS. The advantage of this approach was that they could integrate new sensors into the database easily by just adding key values to the main list. Key-value pairs may be seen as old hat by many, but with HStore they give quite a flexible solution. The work is part of the Science Central project and effectively pulls together all possible data feeds for Science Central to use.
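
A sketch of the pattern in Python (table and column names are illustrative, not Newcastle’s actual schema; assumes a database with the PostGIS and hstore extensions installed):

    import psycopg2
    import psycopg2.extras

    conn = psycopg2.connect("dbname=sensors")      # placeholder connection string
    psycopg2.extras.register_hstore(conn)          # map Python dicts to hstore
    cur = conn.cursor()
    cur.execute("""
        CREATE TABLE IF NOT EXISTS readings (
            id    serial PRIMARY KEY,
            geom  geometry(Point, 4326),
            attrs hstore
        )""")
    # A brand-new sensor type needs no schema change, only new keys:
    cur.execute(
        "INSERT INTO readings (geom, attrs) "
        "VALUES (ST_SetSRID(ST_MakePoint(%s, %s), 4326), %s)",
        (-1.6178, 54.9783, {"sensor": "air_temp", "value": "7.2"}),
    )
    conn.commit()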

The other presentation of note was by Robin Lovelace (Leeds), who invited discussion around the merits of Twitter data in research.  This was not about the ethics of whether users know what data they are giving up, but more about the pros and cons of using the data at all.

  • Con – unregulated data, unfocused, loudest voice dominates
  • Pro – diverse, low cost, continuous, responsive

Using Twitter data may raise the following questions:

  1. Who made it? – the public
  2. Who owns it? – Twitter

As the discussion progressed, it was mentioned that we may be in a golden age for social data: at the moment lots of people are providing information through social media, and the social media companies like Twitter are allowing us to use the info for free. At some point either the public will realise what information they are providing and seek to limit it, or the government will perhaps do so, and social media companies (who trade on information about users) may restrict access to data or try to charge for it.  Interesting and thought-provoking.  If you want to find out more, look at Robin’s presentation and download his code to set up a Twitter listener.
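
For anyone curious about the mechanics, a minimal geo-filtered listener using the tweepy streaming API of the time might look like this. This is a generic sketch, not Robin’s code, and the credentials are placeholders:

    import tweepy

    auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
    auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")

    class GeoListener(tweepy.StreamListener):
        def on_status(self, status):
            if status.coordinates:                    # keep only GPS-tagged tweets
                lon, lat = status.coordinates["coordinates"]
                print(lon, lat, status.text)

    # Bounding box (west, south, east, north) roughly covering Leeds.
    tweepy.Stream(auth, GeoListener()).filter(locations=[-1.80, 53.70, -1.40, 53.90])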

Remote Sensing - I used to do remote sensing, so I thought I would go to this session and see what was new. It turns out that it didn’t have a huge amount of remote sensing in it, but there were a couple of gems worth mentioning. First is the work that Jonny Huck (University of Lancashire) is doing with sensors.  Jonny presented Map.me at last year’s GISRUK and it was good to see this being used in other people’s research, but the sensor work took a different direction. Jonny made a low-cost (£400) pollution monitoring kit that also monitored the VO2 flux of users, allowing him to crudely calculate pollution risk.  It was a simple kit using motes, smartphones and some basic GIS for visualisation. I found it quite refreshing to see a simple approach taking off-the-shelf kit and running simple experiments. This will hopefully lead to discussion, refinement and some really insightful science.
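
The arithmetic behind such a risk estimate is simple: inhaled dose is roughly concentration multiplied by breathing rate and time. This is a standard back-of-envelope approximation, assumed here rather than taken from Jonny’s method:

    pm25_ugm3 = 35.0          # pollutant concentration from the sensor, µg/m³
    ventilation_m3h = 2.4     # breathing rate while cycling, m³/hour (assumed)
    hours = 0.5               # time spent in the polluted area

    dose_ug = pm25_ugm3 * ventilation_m3h * hours
    print(f"approximate inhaled PM2.5: {dose_ug:.1f} µg")   # 42.0 µg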

The other presentation that I enjoyed introduced Whitebox, a geospatial analysis toolkit created by John Lindsay. This is an open-source GIS package and I was stunned by how many tools it has: over 370 at the last count! Possibly most impressive was the lidar processing tool, which will happily open 16GB of raw lidar point cloud and allow you to process it. I don’t know of another open source package which handles lidar.  John likes to call Whitebox open-access rather than open-source. What’s the difference? Well, when you open a module there is a “View Code” button. This will open the code that runs the module so that you can see how it works and what it does.

Whitebox is relatively unknown, but John hopes to push it more, and the audience suggested using GitHub rather than the Google Code repository and working towards OSGeo incubation.  It does look good and I have already downloaded it. Oh, and it is a Java app, so it is easy to get working on any platform.

Plenary – I enjoyed the sessions and found something interesting in each one, but the plenaries were a bit underwhelming. Most conferences use the plenaries to bring everyone together and then get the big cheeses out to show off cutting-edge research or to inspire the audience. The Thursday plenary didn’t seem to do this.

Friday – I was not able to attend on Friday, sorry.

Overall – the conference was well received and I found some of the talks really interesting.  I would have liked to be inspired by a keynote at the plenary, and I hope that GISRUK 2015 in Leeds will use the plenary to motivate the group to continue to do great GIS research. Thanks to the local team for pulling the event together; it is never an easy task.  You even managed to get the weather sorted.

Posted in Conference, GIS

Digimap for Schools adds historic map layer


Old and new

Digimap for Schools has added a new historic map layer to the popular online map service, extending its potential for use in schools across a wider spectrum of the national curriculum.

The new historic map layer features mapping from the 1890s and covers the whole of Great Britain. Teachers and pupils will be able to lay the historic maps over current mapping and compare changes in the landscape in their areas and beyond.

Digimap for Schools is an online application developed by EDINA at the University of Edinburgh. It gives schools easy access to a wide range of Ordnance Survey mapping using a simple login and password. The service is available to all pupils regardless of age. It allows schools to access a variety of mapping scales including Ordnance Survey’s most detailed OS MasterMap and the famous OS Explorer mapping at 1:25,000 scale which is ideal for outdoor activity.

The historic Ordnance Survey maps have been scanned and geo-referenced by the National Library of Scotland (NLS) and made available in Digimap for Schools. The maps were originally published between 1895 and 1899 as the Revised New Series in England and Wales and the 2nd Edition in Scotland. The historic maps are high-quality scans at 400dpi for Scotland and 600dpi for England and Wales. This means that they can be enlarged far beyond their original scale of 1 inch to 1 mile.
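
A quick calculation shows why the scans hold up to enlargement. At 1 inch to 1 mile (1:63,360), one scanned pixel covers only a few metres of ground:

    SCALE = 63360                 # 1 inch to 1 mile
    for dpi in (400, 600):        # Scotland / England & Wales scan resolutions
        metres_per_pixel = (1 / dpi) * 0.0254 * SCALE
        print(f"{dpi} dpi -> {metres_per_pixel:.2f} m per pixel")
    # 400 dpi -> 4.02 m; 600 dpi -> 2.68 m
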
Elaine Owen, Education Manager at Ordnance Survey, added: “This new layer in Digimap for Schools is a fantastic resource for teachers and pupils of all ages, especially if they’re working on a local history project. The historic layer is viewable against a range of modern map scales up to 1:10,000 scale. You can access the maps via a slider bar that allows the contemporary map to be gradually faded away to reveal the historic map. We are adding some new history and geography resources to accompany the layer, including looking at how coastlines have changed over the last 120 years.”

Pupils and teachers using Digimap for Schools can save and print maps at A4 and A3 size. The maps can be printed as a historical map, or combined with the modern map at different transparency settings as a merged image. The full set of annotation tools is available for use on the historic map, providing many opportunities to highlight changes.

Since Digimap for Schools launched in 2010, the service has been adopted by over 20% of secondary schools.

Chris Fleet, Senior Map Curator at NLS, said: “Old maps present our history in one of its most enthralling forms. We are delighted to be collaborating with Ordnance Survey and EDINA in delivering our historic maps to schools through the Digimap for Schools application.”

Peter Burnhill, Director of EDINA at the University of Edinburgh, said: “Students, pupils and their teachers now have unrivalled access to the very best maps to gain rich understanding of how Britain’s landscape has changed in over a century. The result is endlessly fascinating; the skill and generosity of staff at the National Library of Scotland have enabled a real sense of place when combined with the Ordnance Survey maps of today’s Britain.”

Digimap for Schools is open to all schools in Great Britain via an annual subscription. The subscription costs £69 for a primary school and up to £144 for a secondary school.
Posted in Data, Digimap, EDINA, Historic, Learning, Schools

Inaugural Scottish QGIS User Group

QGIS UK

“Today we have a guest blog post from one of the Geo-developers at EDINA.  Mike works as part of the data team and is usually up to his oxters in databases ensuring that the data offered through Digimap is both up to date and in a useful format. Over to Mike.”

Following on from successful meetings in England and Wales, on 19th March I attended the inaugural “Scottish QGIS User Group”, hosted at Stirling University. My first thought revolved around the level of interest that such a meeting would attract but, as it turned out, it was very popular. I was also surprised at the geographical spread of the attendees, with several folks coming from Brighton (Lutra Consulting) and Southampton (Ordnance Survey) as well as all over Scotland and northern England, although the attendees were dominated by public sector organisations.

Talks/Presentations:

A more detailed breakdown of the presentations can be found here: http://ukqgis.wordpress.com/2014/03/25/scottish-qgis-user-group-overview/

From my own perspective, the talks on developing QGIS and cartography in QGIS were of particular interest, demonstrating the ever-growing potential of QGIS. Additionally, the improvements (particularly speed enhancements) that look to be coming soon, as highlighted in Martin Dobias’ presentation, are impressive.

As for the user group itself, it will be interesting to see where it goes from here and what direction it will take. How will future events be funded? How often should the group meet up? In what location? A recommendation from myself would be to have general presentations and talks in the morning, then in the afternoon split into different streams for beginners, users and developers.

At the end of the meet-up (and a few geo-beers in the pub) there was definitely a sense that everybody got something out of the event and would like to attend more meetups in the future.

A special mention of thanks needs to go out to Ross McDonald – @mixedbredie (Angus Council) – for his efforts in organising the event, and additionally to thinkWhere (formerly Forth Valley GIS) for sponsoring the event.

Links and useful things

Posted in Cartography, GIS, Open Source, OSGeo

The search for Flight 370


courtesy of Wayan Vota (https://www.flickr.com/photos/dcmetroblogger/)

As the search for missing Malaysia Airlines Flight 370 approaches its fifth week, the reliance on geospatial technology, and on the skills needed to analyse large volumes of data, is becoming increasingly clear. In this post we will look at some of the geospatial technology and techniques that have been used in the search for Flight 370.

Background

Flight 370 disappeared on the 8th of March 2014, having left Kuala Lumpur en route for Beijing. There was simply no trace of it. Communications were lost somewhere over the Gulf of Thailand. Speculation quickly rose as to the fate of the aircraft, with hijack and rogue pilots being mooted as possible explanations.  A catastrophic break-up of the aircraft through an explosion was not ruled out, but looked unlikely as this would generally be noticed. Furthermore, there was no sign of debris in the area of Flight 370’s last known position.

Data feeds and extrapolation

After a few days, data started turning up that suggested the plane had stayed aloft for several hours after all communication was lost.  Equipment on board transmits information such as status updates and diagnostics, so that engineering teams can monitor the health and performance of components while they are in use.

The engines had sent bursts of data every hour, and these had been picked up by a satellite operated by Inmarsat. By monitoring the Doppler effect in the received data, Inmarsat was able to chart two possible paths: one to the north and the other to the south.  This had never been done before, and the innovative use of this data by Inmarsat allowed the rescue effort to be concentrated in two distinct areas.

After a bit of tweaking and refining, the Inmarsat scientists were able to discount the northern corridor and the search refocused on the southern corridor, a vast expanse of ocean west of Australia with no suitable landing site.  How they achieved this was really quite clever: they used “truthing data” from other aircraft to monitor the Doppler effect and therefore refine their estimates for Flight 370. They then calculated the speed and altitude of the aircraft and were able to work out roughly where it would have run out of fuel and ditched into the ocean.  This greatly reduced the search area.
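
The principle is the ordinary Doppler relation: the frequency shift of each hourly ping constrains the aircraft’s velocity along the line of sight to the satellite. A deliberately simplified sketch; the carrier frequency is assumed and Inmarsat’s actual processing was far more involved:

    C = 299_792_458.0     # speed of light, m/s
    F0 = 1.6e9            # assumed L-band carrier frequency, Hz

    def radial_velocity(shift_hz):
        """Line-of-sight speed implied by a measured Doppler shift."""
        return shift_hz * C / F0

    print(f"{radial_velocity(100.0):.1f} m/s")   # a 100 Hz shift is ~18.7 m/s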

Satellite analysis

The search area had been focused to a small section of ocean (OK, so “small” in this case means the size of Western Europe, but given the size of the southern Indian Ocean this can be considered small).  It was now feasible to start analysing aerial imagery to try to identify debris (given that there was nowhere for the plane to land, on the 24th March Malaysian officials announced that it was beyond reasonable doubt that the plane was lost after ditching in the southern Indian Ocean). Trawling around to find out which satellites were used was harder than I thought it would be.  Below is a summary of what I found:

  • GAOFEN-1 – a high-resolution optical sensor run by the CNSA (Chinese National Space Administration), launched in April 2013. Gaofen-1 is equipped with a 2 metre resolution CCD (charge-coupled device), an 8 metre resolution multi-spectral scanner and a 16 metre resolution wide-field multi-spectral imager. It is difficult to tell which sensor produced the image below, but from the resolution it looks like it was the 8m multi-spectral scanner.

Chinese satellite image of possible debris – Pic from The Guardian/Reuters

  • A French satellite operated by Airbus Defence and Space spotted 122 objects in a cluster. The objects were up to 23m in length (image released by MOSTI). Airbus Defence and Space have a host of satellites run through Astrium, including EnviSat, CryoSat, Copernicus, ELISA and Helios 2.

Airbus Defence Image

  • Australian AP-3C Orion – Orion aircraft were deployed to likely search areas and scanned them.  It is likely that the crew were using a combination of electronic surveillance systems and just their eyes. This might seem old-school, but it is an effective method of verification, as trained operators can discount or confirm sightings from remote sensing. The aircraft has a long range and can fly low, making it ideal for searching.

Ocean Currents

Why has it taken so long to refine the search area?  Well, there are lots of satellites, but only a few of them would have had suitable sensors on board. Data is collected and beamed back to a receiving centre, and the raw data will most probably have to be processed before it can be used for anything.  This takes time.  The search area may well have been narrowed to a chunk of the southern Indian Ocean, but this still represents a huge area, not dissimilar in size to Western Europe.  Processing and analysing data for such a large area is not easy and will rely on a degree of automation followed by human verification.

The southern Indian Ocean is a wild place with frequent storms. We can see from the above that optical sensors have been used, and these are unable to penetrate cloud cover. Scientists would have to wait for the satellite to pass over the same area to try to get a better, cloud-free image. The repeat cycle may be anything from 1 day to 10 days or more.

Then you add in the ocean currents.  Any object floating in the ocean will not be static and could drift by tens of kilometres a day. Given that the plane was likely to have crashed 15 days previously, debris could be hundreds of kilometres from the crash site; that is, if it has not already broken up and sunk.  But we can at least model the ocean currents and estimate the potential dispersal of the debris.  The NY Times has some excellent visualisations of both the currents and the wave heights in the southern Indian Ocean during March.  These have been produced by the National Oceanic and Atmospheric Administration and the National Centers for Environmental Prediction using remote sensing data, in-situ data (buoys) and models.  While never 100% accurate, they provide an indication and convey the uncertainty involved in determining a search area.
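
At its simplest, drift modelling just advects a position with a velocity field. A toy sketch with a steady current; real models such as NOAA’s use gridded, time-varying currents and wind drag:

    def drift(lon, lat, u_ms, v_ms, days):
        """Move a position by a steady current (u east, v north, in m/s)."""
        metres_per_degree = 111_320.0          # rough value; shrinks with latitude
        dx_deg = u_ms * 86400 * days / metres_per_degree
        dy_deg = v_ms * 86400 * days / metres_per_degree
        return lon + dx_deg, lat + dy_deg

    # A 0.3 m/s eastward current over 15 days moves debris roughly 390 km:
    print(drift(90.0, -35.0, 0.3, 0.0, 15))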

Locating flight recorders

Once a search area has been identified, the searchers are able to deploy listening devices which locate “pings” emitted by Flight 370’s black box. This is achieved by towing a listening device (the TPL-25) back and forth across a wide area.  Pings should be received periodically, and the position and strength of these should triangulate the position of the black box. But the sea floor is not flat in this area: it is around 4,500m deep with mountains up to 2,500m high.  We actually know very little about remote ocean sea beds.  We have limited data collected by ships, and most representations come from spaceborne remote sensing data. These are not very accurate and may “miss” large structures (1-2km high) such as seamounts. There is a nice overview of ocean mapping on the BBC website.

The difficulty of retrieving debris from deep, remote oceans was highlighted by the search for Air France Flight 447.  In that case, both black box transmitters failed.

A Chinese ship detected a ping on the 5th of April, and a day later an Australian ship detected a ping, but the pings were quite far apart.  The Australian ship’s detection seemed more consistent and stronger, and this was backed up by more detections in the same area on the 8th. It is a slow process, but each detection should help reduce the uncertainty.  The question is: will the batteries in the transponders last much longer?  They are already at the limit of what is expected, so time is running out.

Remote Sensing Critical

It is clear that remote sensing technology has been critical at every stage of the search for Flight 370, and it will continue to be so until the plane is found.  It has been used effectively to narrow search areas and discount blind alleys. It is also interesting to note how associated data has been used in ways that were never intended in order to locate the plane, and praise should be given to the Inmarsat scientists who came up with a novel solution when data and information were scarce.

Articles:

  • The search for Malaysian Airlines Flight 370 – a great article in the New York Times that focuses on the remote sensing data that is being used now that search teams have identified a “likely” crash site in the Southern Indian Ocean.
  • Wikipedia – a growing resource for information about Flight 370
  • Wikipedia - Air France Flight 447
  • NY Times – nice collection of visualisations of ocean conditions in the search area
Posted in Remote Sensing