GoGeo Service to end on 31 July 2016

In case you haven’t seen the notice on our home page (www.gogeo.ac.uk), the GoGeo service, funded by Jisc, will end on 31 July 2016.

GoGeo Home Page

Posted in EDINA, JISC

IASSIST 2015 41st Annual Conference



Minneapolis, MN, USA, 2 to 5 June 2015
Host institution: Minnesota Population Center at the University of Minnesota


The theme of the 2015 conference was Bridging the Data Divide: Data in the International Context, with many of the sessions dedicated to research data management in academia, which is being embraced across a growing number of UK academic institutions. I seem to recall that about 20 percent of UK academic institutions have a research data management strategy in place, so these sessions were of considerable interest, and well attended.

Data Infrastructure and Applications sessions were also prominent at the conference, with some interesting presentations relevant to EDINA and good attendance, especially for the Block 5, E1 session on Geospatial and Qualitative Data on Thursday, 4 June, 13:30 to 15:30. My presentation on GoGeo was slotted into this session alongside three others, which focused more on qualitative data. http://iassist2015.pop.umn.edu/program/block5#a1

Plenary Sessions

The first plenary session was interesting: Professor Steven Ruggles, from the Minnesota Population Center, provided an overview of the history of the US Census and how it was at the forefront of data capture, processing and dissemination. The second plenary speaker, Curtiss Cobb, from Facebook, tried to make the case that Facebook serves as a force of social good in the world, and Andrew Johnson, from the City of Minneapolis, spoke at the final plenary session on Friday with an overview of the City’s open data policy.

Summaries of relevant presentations

3 June, Wednesday morning session:
A3: Enabling Public Use of Public Data

Mark Mitchell, from the Urban Big Data Centre (UBDC) at the University of Glasgow, gave an interesting presentation titled And Data for All. The UBDC takes the urban open data that Glasgow City Council has created and makes it available to the public and to academia through its UBDC Data Portal (http://ubdc.gla.ac.uk/), which currently holds 934 datasets, primarily from the Glasgow City Council and the Greater London Authority. MM noted that the portal was built with CKAN, and that R and QGIS are used at UBDC. He also noted that the portal has 300+ users, and that UBDC tries to provide good metadata records and crosslink these with their datasets.
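Since the portal is built on CKAN, its datasets can in principle be queried through CKAN's standard action API. Below is a minimal sketch in Python, assuming the UBDC portal exposes the usual /api/3/action/package_search endpoint (the endpoint path is standard CKAN; its availability on the UBDC site is an assumption, not verified):

```python
import json
from urllib.parse import urlencode

# Base URL taken from the post; the API path is the standard CKAN action API.
UBDC_BASE = "http://ubdc.gla.ac.uk"

def package_search_url(query, rows=10):
    """Build a CKAN package_search request URL for a free-text query."""
    params = urlencode({"q": query, "rows": rows})
    return f"{UBDC_BASE}/api/3/action/package_search?{params}"

def summarise_results(response_text):
    """Pull dataset titles out of a CKAN package_search JSON response."""
    payload = json.loads(response_text)
    if not payload.get("success"):
        return []
    return [pkg["title"] for pkg in payload["result"]["results"]]
```

A caller would fetch `package_search_url("transport")` with any HTTP client and pass the response body to `summarise_results` to list matching dataset titles.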

MM noted that metadata quality varied considerably, but indicated that the Glasgow City Council planned to mandate a minimum standard for metadata quality.

Some issues were revealed, most notably differences in coordinate reference systems between datasets: Transport Planning used British National Grid, while Health Services recorded coordinates as northing-easting.
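Mismatches like this are easy to trip over when merging datasets. A crude heuristic, no substitute for proper CRS metadata, is to sniff coordinate magnitudes before combining layers. The ranges below are illustrative assumptions for Great Britain, not authoritative limits:

```python
def guess_crs(x, y):
    """Crudely classify a coordinate pair by magnitude.

    British National Grid (EPSG:27700) eastings run roughly 0-700,000 m
    and northings 0-1,300,000 m, while geographic coordinates for Great
    Britain sit roughly at lon -8..2 and lat 49..61 degrees.
    """
    if -9.0 <= x <= 3.0 and 49.0 <= y <= 62.0:
        return "geographic (lon/lat degrees)"
    if 0 <= x <= 700_000 and 0 <= y <= 1_300_000:
        return "projected (BNG metres?)"
    return "unknown"
```

Flagging a layer whose coordinates fail the expected check is cheap insurance before any spatial join or overlay.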

He also pointed out an interesting result in a survey conducted in Glasgow which revealed support for the use of personal data for societal benefit, but not for commercial interest.

He touched on the ESRC-funded Integrated Multimedia City data (iMCD) project, which is intended to capture urban life through surveys, sensors and multimedia.

Then, on that same strand, he made reference to the gamification of data: a Minecraft server would be used to introduce Glasgow open data to Glasgow primary school children through Minecraft, an interactive block game, to make geography and maps more engaging and interesting.

More about this can be found on the UBDC website via this link.

Someone noted during questions that the Australian Bureau of Statistics (ABS) has created a mobile game called Run That Town, which incorporates data from every postal area in Australia.

Run That Town gives each player the ability to nominate any Australian town and take over as its virtual ruler. Players have to decide which local projects to approve and which to reject, with the real Census data of their town dictating how their population reacts. To win, players need to maintain their popularity, making Census data central to the gameplay and giving players the chance to use the data themselves.

Mark also mentioned collaborative efforts between UBDC and the Glasgow School of Art to create noise and light maps for the City of Glasgow, and noted that housing charities were requesting more data from the Glasgow City Council as well.

Winny Akullo, from the Uganda Bureau of Statistics, delivered another presentation in this session, providing an overview of the results of a quantitative study in Uganda that investigated ways of improving the dissemination of statistical information there. The results indicated that the challenge remains, and that more resources are required to improve data dissemination.

Margherita Ceraolo, from the UK Data Service, wrapped up the session with her presentation about the global momentum towards promoting open data, including support from national governments and IGOs (e.g. the IMF, World Bank and UN).

She referred to macro data as well as boundary data, and noted that the UKDS is building an open API for data re-use, with release scheduled for the end of 2015. She also mentioned a map visualisation interface to display all data in their collection.

3 June, Wednesday afternoon session:
B5: Building on Common Ground: Integrating Principles, Practices, and Programs to support Research Data Management

Lizzy Rolando (Georgia Tech Library), and Kelly Chatain, from the Institute for Social Research (ISR) at the University of Michigan, gave interesting presentations on support for research data management at their respective institutions. Session Chair, Bethany Anderson, from University Archives at the University of Illinois-Urbana, also discussed ways of integrating the work of academic archives and research data services to appraise, manage and steward data.

Some key points that they noted during their presentations included the following:

  • requiring a chain of custody for data to encourage collective ownership and responsibility;
  • prioritising data use over preservation; and
  • adopting a data retention policy like Purdue University’s, which requires a reappraisal of data every 10 years.

These are eminently sensible approaches to data management in academia. Granted, the first faces resistance, but if data creators and users refuse to be accountable for data, then who assumes this responsibility? Ownership needs to be addressed if data are to be managed and shared, and if it becomes a collective responsibility, perhaps there will be more willingness to treat data management as a shared activity.

Data re-use ought to be prioritised as well, and data periodically assessed rather than stored on various media and forgotten. The sheer volume of data, terabytes of it, has become another classic excuse for eschewing the responsibilities of data documentation and metadata creation.

It is uncertain how many spatial datasets are worth a place in archival storage. If a spatial dataset has no value, it should be deleted rather than saved; the question is who makes these decisions, though presumably it would fall to each department.

3 June, Wednesday afternoon late session:
C5: No Tools, No Standard — Software from the DDI Community

Listened to a presentation about the Ontario Data Documentation, Extraction Service and Infrastructure (ODESI) and the Canadian Data Liberation Initiative (DLI), with reference to Nesstar. Nesstar is a software system for data publishing and online analysis; the Norwegian Social Science Data Services (NSD) owns it, and I recall it from my time working at the UK Data Archive years ago.

4 June, Thursday morning session:
D4: Minnesota Population Data Center (MPC) Data Infrastructure: Integration and Access

This session provided an overview of the Minnesota Population Center (MPC) project activities, with most of the presentation about the Integrated Public Use Microdata Series (IPUMS) (www.ipums.org), which is dedicated to collecting and distributing free and accessible census data, both US and international.

It was interesting to note the breakdown of users: economists were the largest group, at 31 percent; demographers and sociologists accounted for 16 percent; and journalists and government users for 15 percent. Only 8 percent of users were identified as geographers/GIS users, though the presenters indicated that their numbers were growing.

The North Atlantic Population Project (NAPP) was mentioned, which includes 19th and early 20th century census microdata from Canada, Great Britain, Germany, Iceland, Norway, Sweden, and the US, so it is worth noting that British census data are available as well.

The Terra Populus project (http://www.terrapop.org/) was also covered and sounded quite interesting. The goal of the project is to integrate the world’s population (census) data with environmental data (remotely sensed land cover, land cover records and climate data).

There is also a temporal aspect to this, which examines interactions between humans and the environment over time to observe changes that take place between the two.

There is also a TerraPop Data Finder being built, currently in beta, which holds census data along with land use, land cover and climate data.

The MPC has also been involved with the State Health Access Data Assistance Center (SHADAC) Data Center, doing analysis on estimates of health insurance coverage, health care use, access and affordability using data from the 2012 National Health Interview Survey (NHIS).

4 June, Thursday afternoon session:
E1: Geospatial and Qualitative Data

There was exceptionally good attendance for this session, with most of the room filled. Amber Leahey, the Data Services Metadata Librarian at the University of Toronto, chaired our session. I had a chance to talk to her afterwards and learned about the Scholars GeoPortal (http://geo1.scholarsportal.info/), an online resource through which Canadian academics and students access licensed geospatial datasets via a subscription service, much like Digimap. It is an impressive portal, and the data are free, though it provides a limited number of Canadian datasets. They encourage data creators to upload their datasets to the portal, much like Digimap ShareGeo, but face similar challenges to those here.

Andy Rutkowski (USC) started the session with his presentation on using qualitative data (social media, tweets, interviews, archived newspaper classifieds, photographs) to improve the understanding of quantitative data and produce more meaningful maps: maps as social objects, a move towards spatial humanities?

He alluded to skateboarders’ information about pavement conditions at various locations in Los Angeles that led to a new skateboard park.

He also referred to Professor Nazgol Bagheri’s (UT San Antonio) work on mapping women’s socio-spatial behaviours in Tehran’s public spaces using photographs and narratives linked to GIS data from the Iranian Census, national GIS database and City of Tehran; all this to generate a qualitative GIS map that displays the gendering of spatial boundaries.

He concluded with a reference to the LA Times Mapping project, which started in 2009 and displays the neighbourhoods of Los Angeles, which have been redrawn using feedback from readers whose perceptions of boundaries differed from the original ones.  http://maps.latimes.com/neighborhoods/

The next presentation (The Landscape of Geospatial Research: A Content Analysis of Recently Published Articles) was a joint collaboration among library staff at the University of Michigan, reporting on their efforts and results in capturing information from the body of published literature on geospatial research. Samples of articles, from a selection of multi-disciplinary journals with spatial themes, were UID coded for content, including spatial data cited, software used and research methodology; I assume the software would be ArcGIS, ERDAS, MapInfo, etc.

Metadata was also compiled for the articles, including title, subject, author(s), subject affiliation, number of authors and their gender; this information was extracted through multi-coding. There was also reference to geo-coordinate analysis and to building the schema to support this information extraction.

Certainly the Unlock geo-parser (http://edina.ac.uk/unlock/) comes to mind as relevant to their project. We have already discussed the possibility of doing something similar with GoGeo, extracting and harvesting metadata from open access journals. Publications represent the best sources of spatial data information: most are peer-reviewed and cite the data used, which should address concerns about data quality and the purpose for which the data were created. Each publication would also provide the author(s) name(s) and contact details for those interested in acquiring the data, which might in turn encourage researchers to release their data through GoGeo rather than face personal requests for it.
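To illustrate the harvesting idea, the sketch below drafts a minimal discovery-metadata stub from an article title and abstract, using a toy gazetteer and a few data-citation cue words. The gazetteer and cue list are illustrative assumptions; the real Unlock geo-parser resolved place names against full gazetteers, so this only demonstrates the principle:

```python
import re

# Toy gazetteer for illustration only; a real geo-parser would resolve
# names against a full gazetteer such as GeoNames or Unlock's own.
GAZETTEER = {"Glasgow", "Edinburgh", "London", "Tehran"}

# Cue words suggesting the article cites spatial or survey data.
DATA_CUES = re.compile(r"\b(dataset|shapefile|survey data|census data)\b", re.I)

def draft_metadata_stub(title, abstract):
    """Draft a minimal discovery-metadata stub from title and abstract."""
    capitalised = set(re.findall(r"[A-Z][a-z]+", abstract))
    return {
        "title": title,
        "places": sorted(capitalised & GAZETTEER),
        "mentions_data": bool(DATA_CUES.search(abstract)),
    }
```

A human cataloguer (or the article's author) would then refine such a stub into a full metadata record, rather than the stub being publishable as-is.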

My presentation followed and can be found on this EDINA page.
GoGeo: A Jisc-funded service to promote and support spatial data management and sharing across UK academia

One of my comments, and a photo of one of my slides, reached the IASSIST conference’s Twitterland and went viral at the conference. I noted that metadata creation is important, but the reality is that after 14 years of metadata coordination in both the public sector and academia, I have yet to meet anyone who has actually expressed any pleasure in creating metadata.

creating metadata reality

My presentation provided an overview of EDINA, Jisc and the GoGeo Spatial Data Infrastructure, then summarised the latter’s successes and shortcomings: the successes attributed to GoGeo users searching for data; the shortcomings, to GoGeo users being unwilling to share their data. It also offered the audience new approaches to encourage spatial data management and sharing, including a mandatory requirement for students to use Geodoc to document data cited in their dissertations and theses as a condition of graduation. It is often easier for a department to impose this requirement on its students than on its faculty, but if students document their data, future students can access the metadata records as part of their literature review, and access data that might complement their own research data. This in turn would require university departments to take ownership of their students’ data and make it available to others, so that spatial data is at least shared internally. Access could be restricted to the department, or to the university if a data management policy and supporting infrastructure are in place; if not, GoGeo provides this.

The use of Geodoc and the GoGeo private catalogues was also presented as another approach to supporting spatial data information management. Geodoc can be used at the personal level: a researcher documents his or her spatial data, then uses Geodoc to store and update those records. There is then the option of exporting Geodoc records to attach to shared spatial datasets, which seems the preferred option, as academics will entrust their data to colleagues rather than make them openly available; the recipient can import the metadata record into his or her own Geodoc for updating and editing. The other option is for Geodoc users, whether part of a research project group, a department or a university, to publish their metadata records to a GoGeo private catalogue, which only those with assigned usernames and passwords can access. As I manage these catalogues, I can grant access to those who have been given permission to use the metadata records, including collaborators on the same project at different universities.
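The "attach the record to the shared dataset" workflow can be sketched generically: bundle the dataset and a JSON metadata record in a single archive, so the record travels with the data and can be re-imported at the other end. This is an illustration of the idea only; Geodoc's own export format is not reproduced here.

```python
import io
import json
import zipfile

def bundle_with_metadata(data_bytes, data_name, record):
    """Zip a dataset together with its metadata record for sharing."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as z:
        z.writestr(data_name, data_bytes)
        z.writestr("metadata.json", json.dumps(record, indent=2))
    return buf.getvalue()

def read_metadata(bundle_bytes):
    """Recover the metadata record from a shared bundle."""
    with zipfile.ZipFile(io.BytesIO(bundle_bytes)) as z:
        return json.loads(z.read("metadata.json"))
```

Because the record rides inside the same archive as the data, the recipient never has to chase the creator for documentation, which is the point of bundling metadata with shared datasets.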

The hoped-for outcome is that after these records and their datasets have served their purpose, the records would be published in GoGeo’s open catalogue and the data uploaded to ShareGeo or a GoGeo database. It would be better to have both the metadata and the data in the GoGeo portal, rather than split between GoGeo and the ShareGeo data repository as is the case now; ShareGeo records from 500 to 3,000 downloads a month, so it would be better to redirect those users to GoGeo.

My presentation also noted the Jisc commitment to providing resources to the UK academic community in support of research data management, and that about 20 percent of UK universities have a research data management policy in place.

In line with the Landscape of Geospatial Research presentation, the search interface in GoGeo could also be updated to search and harvest metadata from peer-reviewed open access journal publications. It would be an important step forward if publishers required authors to release their data, but there seems to be no movement on that front: it is in the financial interest of most publishers to publish more, and they might see such a requirement as an imposition on researchers that would result in fewer publications.

If there was any consolation, other presentations at IASSIST revealed similar experiences (see the 5 June, Friday morning session), so academia represents a formidable challenge both here and in the US, and probably in most other countries as well.

Mandy Swygart-Hobaugh (Georgia State University) concluded the session with her presentation on qualitative research. She asked whether social sciences data services librarians devote their primary attention to quantitative researchers to the detriment of qualitative researchers; her survey indicated that support is overwhelmingly biased towards quantitative data researchers.

5 June, Friday morning session:
F5: Using data management plans as a research tool for improving data services in academic libraries

Amanda Whitmire (Oregon State University), Lizzy Rolando (Georgia Tech Library) and Brian Westra (University of Oregon Libraries) combined to offer interesting presentations.

AW talked about the DART Project (Data management plans as A Research Tool). This NSF-funded project facilitates a multi-university study to develop an analytic rubric to standardise the review of faculty data management plans at Oregon State University, the University of Michigan, the Georgia Institute of Technology and Penn State University.

This poster offers more insight about the DART project.

She also talked about the Data Management Plan (DMP) tool, which can provide a rich source of information about researchers and their research data management (RDM) knowledge, capabilities and practices. She revealed some notable findings: possible plagiarism, with 40 percent of researchers sharing text in their plans; geographical research comprising only 8 percent of RDM activity, so probably no different from the UK, where the social sciences/geosciences seem more averse to data management and sharing; and only 10 percent of researchers approaching the RDM staff for assistance.

The DMP tool also has the functionality to see cross-disciplinary trends without engaging with the researchers, which, given that only 10 percent of researchers approach the RDM staff, is probably just as well. She noted that the trends were high for the likes of Mathematics and Physics and low for geography, which is really no surprise.

Further assessment revealed that eight research plans did not indicate any intent to release data; five plans indicated a selective release of ‘relevant data’, which she interpreted as leaving release to the researchers’ discretion, just another way of saying ‘no’ to data sharing.

In addition, she reported that researchers described their data types well, but made no mention of metadata creation, data protection or data archiving, and only some mention of data re-use.

Lizzy Rolando revealed similar results during her presentation which involved feedback from researchers at Georgia Tech.

Asked how they planned to share their data, researchers indicated the following:

– Citation in journals: 22 percent
– Conferences: 10 percent
– Repository: 9 percent
– Other repository: 7 percent

In effect, most researchers perceived that the citation of their data in journals or at conferences was effectively data sharing; only a minority seemed inclined to share their data directly.

The survey results also indicated that researchers were not aware of metadata standards, or of metadata at all. They expressed a willingness to share their data but not to archive it; again, their interpretation of data sharing seems to mean sharing only through citation.

LR suggested that one way to encourage researchers to create metadata is to do so informally, through note taking. But would researchers be willing to share their notes, or allow librarians or others to use them to create metadata?

I have offered my services to academics, but no one has accepted the offer to provide their data for me to extract information and document their datasets, and that is a step further than asking researchers to take notes about their data.

It is a good idea, and a reasonable approach to data management, but can it succeed? Without any formal structure, what will happen to the notes? Will those files be stored randomly on various media, accidentally deleted, or not updated to reflect changes made to the dataset?

Brian Westra, from the University of Oregon, offered a summary of a similar survey conducted at his university, targeting researchers in Chemistry, Biological Sciences and Mathematics.

Asked about data documentation/description and metadata standards, 51 researchers in Biological Sciences and Chemistry acknowledged the following:

– Data description: 14
– Could identify metadata standards: 10
– Making data public: 14
– Mentioned data formats: 12

The Dryad repository was mentioned amongst the 14 who responded to making data public, but again, with only 10 respondents acknowledging familiarity with metadata standards, there are RDM issues here as well.

Feedback also indicated that most researchers were concerned about trusting others with their data. Though 14 respondents acknowledged that they shared their data, most indicated that they did so through citation in publications or via their own websites, so again a reluctance to physically share data; where data were actually shared, it can be inferred that it was one-to-one, with colleagues they could trust.

The survey of researchers in Chemistry suggested much the same. A majority indicated that they shared their data through citations in publications, or only on ‘specific request’; again trust comes into play, and one assumes such requests would be approved if they came from a close or trusted colleague.

The respondents noted the following methods of data sharing, in this order:

– Publications
– On request
– Personal website
– Data centre
– Repository
– Conferences

None of the respondents made any reference to metadata or standards.

BW concluded with an overview of the National Science Foundation’s (NSF) effort to encourage research data management and sharing, which essentially requires the research community, as recipients of considerable NSF funding, to establish data management practices. However, BW noted that it is not happening, though he cited one recent occasion where continued funding for a postgraduate student was withheld until the student had submitted an RDM plan to the NSF. So there has been little progress, even from a major funding body like the NSF. This sounds similar to experiences at NERC, where researchers saw funding as a one-off and so felt no obligation to submit their data to NERC after the project finished, though I understand NERC planned to review this and find another strategy to encourage better data management and sharing.

The resistance within academia to both data management and sharing is quite concerning, as access to the data should be part of the peer-review process. In this Reuters article, and others, it is noted that there are publications where the data do not hold up to scrutiny, which is an alarming concern.

As governments continue to cut research funding, it becomes increasingly difficult for researchers to collect sufficient data for proper analysis, and they become less inclined to share the data they have. Will this only exacerbate the problem, or are there other issues as well? Certainly trust seems to be a key concern amongst researchers, and these IASSIST presentations reaffirm the reality here: reluctance to share data is widespread, and even data management seems too much to ask of most researchers. Metadata creation is far removed from the actual data processing, analysis and publication of results, so most researchers would rather spend time with their datasets than with their descriptions, especially as most have no intention of sharing their datasets publicly and only share them with those they trust. However, rather than fielding questions about their datasets with each request, researchers could use the Geodoc metadata editor to document their datasets and bundle the corresponding metadata records with them when sharing with trusted colleagues.

Perhaps, over time, researchers will be willing to share both their metadata and data with the public, but that time still seems far in the future. For now, support must be made available to those who want to manage their data and share it with those they trust.

5 June, Friday afternoon 

I had planned to attend the G2 session on Planning Research Data Management Services, but had the fortunate opportunity to speak with Professor Bob Downs from Columbia University. GoGeo harvests metadata from the Socioeconomic Data and Applications Center’s (SEDAC) portal catalogue, which CIESIN hosts at Columbia University, so Professor Downs had asked me about this during question time after my presentation on Thursday.

We discussed both SEDAC and GoGeo, and he mentioned that DataCite is a useful source for locating catalogues to harvest metadata from, with SEDAC’s catalogue included on the website. He also mentioned tracing the use of SEDAC data in publications through citation; the data have been cited more than 1,000 times, which is quite impressive and clearly demonstrates the benefit of making their data open access, and the success of the SEDAC portal.
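This kind of catalogue discovery and citation counting can be scripted against DataCite's REST API. A hedged sketch: the /dois endpoint and client-id filter follow the current api.datacite.org API (which post-dates the 2015-era service), the SEDAC client id below is a placeholder, and the citationCount attribute is assumed to be present on DOI records:

```python
import json
from urllib.parse import urlencode

def datacite_dois_url(client_id, page_size=25):
    """Build a DataCite REST API request for DOIs registered by one repository."""
    params = urlencode({"client-id": client_id, "page[size]": page_size})
    return "https://api.datacite.org/dois?" + params

def total_citations(response_text):
    """Sum citation counts across the DOI records in one response page."""
    payload = json.loads(response_text)
    return sum(d["attributes"].get("citationCount", 0) for d in payload["data"])
```

Fetching `datacite_dois_url("cdl.sedac")` (the client id is a guess, not verified) with any HTTP client and feeding each page to `total_citations` would approximate the kind of citation tally Professor Downs described.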

That was IASSIST 2015 in Minneapolis, Minnesota. The 2016 conference will be held in Bergen, Norway.






Posted in Conference, GIS

NLS publishes new 1840s-1950s Ordnance Survey detailed maps for London and South-East England

The National Library of Scotland has just made freely available online 16,865 historic Ordnance Survey maps covering Greater London and the south-east of England. The Ordnance Survey 25 inch to the mile (1:2,500) maps were the most detailed series covering both urban and rural areas in the 19th century, and date from the 1840s to the 1950s. The maps are immensely valuable for local history, allowing practically every feature in the landscape to be shown. They provide good detail of all buildings, streets, railways, industrial premises, parkland, farms, woodland and rivers.

At present, coverage is of Berkshire, Buckinghamshire, Essex, Hampshire, Hertfordshire, Kent, London, Middlesex, Surrey and Sussex. The scanning of this series is underway and will expand geographically over the next couple of years.

NLS has also georeferenced a layer of these maps dating from the 1890s to the 1920s, allowing them to be compared directly with modern-day maps and satellite images using a transparency slider, as well as side by side on screen.

NLS OS 25 inch London side by side

Posted in Historic

Call for Papers “GeoCom 2015: Resilient Futures”: closing date extended to Friday, 12 June 2015

The Association for Geographic Information (AGI) is seeking papers for its flagship annual conference. GeoCom is a key event in the UK geo calendar, with representation from across the GeoCommunity (hence the name!) including commercial, public (central and local government) and third-sector organisations.

The theme of this year’s conference is Resilient Futures, and brings together all the topics examined during our GeoBig5 events (http://www.agi.org.uk/events/geo-the-big-five) through the year:

Smart Energy (past): on securing future energy sources to meet growing demand, this event showed the significant role of location in connecting demand to supply.

BIM: The Next Level (past): probably one of the most significant developments in the geospatial arena of recent years, BIM is more than a 3D model; it is an entire process, and one entirely dependent on location-based data.

Sensors & Mobile (past): examined the impact of an ever increasing capability to capture and ‘sense’ location based data.

Future Cities: Security (Thursday, 9 July): the role that geospatial information has to play in preparing for future shocks and stresses

Big Data & You (Thursday, 8 October): examining the ethics of big data, privacy and the special role that location plays in the debate.

The AGI is keen to hear from thought leaders in all these areas and encourages members to submit an abstract based on their work in these sectors. But this is not just for the ‘usual candidates’! There is a strong development aspect to the conference, with dedicated spaces for those at an early stage in their career, supported by our Early Career Network team (http://www.agi.org.uk/news/agi/721-successful-first-ecn-webinar).

Priority will be given to papers that explore themes around Resilience and the Big5 topics; however, papers on any aspect of Geographic Information and Research are encouraged, in particular for our Technology stream.

It’s a very simple process: abstracts of up to 350 words should be submitted before 30 May 2015 (now extended to Friday, 12 June 2015) via our online form:


For further information about this and other AGI events please see our website:



If you have any questions, please contact the AGI via email ( info at agi.org.uk )

Posted in Uncategorized

ESRI UK Annual Conference

GoGeo attended the ESRI UK annual conference in London on 19th May. The event was well attended, with around 2,500 delegates, making it the biggest GIS event in the UK.

The morning plenary was kicked off by Stuart Bonthrone, ESRI UK’s Managing Director, who gave an overview of the current challenges facing the world and how GIS could be used to help monitor and manage these changes. Stuart was followed by a representative from the Port of Rotterdam, which used the ESRI platform to integrate and manage its spatial data in order to improve efficiency in a confined area where physical expansion of the port is no longer possible. One of the Port of Rotterdam’s key requirements was for the final system to be simple to use, with users able to find the information they require within three clicks. To prove they had achieved this, they invited a group of school children in to test the final software! An inspiring talk by Walking With The Wounded followed. Their next challenge is a Walk of Britain, which will cover around 1,000 miles over a period of 6-8 weeks, assisted by mapping services from ESRI.

The final sections of the plenary were delivered by Charles Kennelly and by the Technical Research Team, led by Sarah Lewin. Charles gave a detailed overview of the ArcGIS platform and explained some of the future plans, including how support for ArcMap will continue ‘as long as it is needed’; it won’t simply be turned off following the release of ArcGIS Pro earlier this year. Sarah and the team gave a great demonstration of the 3D visualisation and analysis capabilities of ArcGIS Pro and of the JavaScript library in indoor tracking applications.

ESRI UK Annual Conference Higher Education Track

Photo credit: @Addy_Pope ESRI UK

After the plenary, GoGeo attended the Higher Education track. The track was well attended, with some talks standing room only. A couple of the talks were more technical and may have been better suited to the GISRUK audience, but on the whole they were pitched about right and were well received. More than one speaker highlighted the wish to embed GIS in undergraduate teaching, not just in geographic disciplines but in other subject areas where GIS could be of real benefit. Given the positive pro-GIS atmosphere around the conference, it was surprising to hear that Newcastle University, the only university in the UK to offer an undergraduate degree in GIS, is struggling to attract students.

In the closing plenary ESRI showcased some of their interesting R&D work. It’s good to see such a major player in the GIS world not resting on their laurels and continuing to develop the technology in exciting and innovative ways.

The ESRI Annual Conference has grown and grown over the years and this year there were nine parallel tracks meaning it was sometimes difficult to decide what to attend. With this in mind it may be useful if future events are held over two days with some repetition to allow attendees to catch more sessions.

Update 9th June 2015: Since writing this blog post, ESRI have published all the presentations from the day. If you were unable to attend the event or missed some of the sessions because they conflicted with others then you can now catch up online. The opening and closing plenary sessions are available to stream as videos, the slides for the other sessions are all available to view at the URL below:


Posted in Conference, GIS | Tagged | Leave a comment

GoGeo Mobile has been released

The GoGeo Mobile iPhone app was created by EDINA at the University of Edinburgh to support teaching, learning and research.

Jisc provided support for the GoGeo App project as part of its commitment to encourage the use of new and emerging technology to support research and learning in the UK.

GoGeo Mobile is an app that allows users to keep abreast of news and events in the geospatial sector. GoGeo Mobile is separated into a number of channels including News, Events, Jobs and Resources for Teachers. Each channel contains useful and relevant resources for anyone working with Geographic Information Systems (GIS), Remote Sensing or spatial data.

In addition, GoGeo Mobile allows users to perform targeted searches for spatial data. Searches can be defined by keyword and/or location and return a brief description of the data; users can then forward themselves a direct URL to the metadata record so they can download the data when they are back at their desk.
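As a rough illustration of that kind of keyword-and-location search, the sketch below filters a list of metadata records by a search term and a bounding box. The record fields, the `search_metadata` function and the example datasets are hypothetical assumptions for illustration, not GoGeo's actual API or catalogue.

```python
# Hypothetical sketch of a keyword/location metadata search in the style
# described above. Field names, function and sample records are invented
# for illustration; this is not GoGeo's real search interface.

def search_metadata(records, keyword=None, bbox=None):
    """Filter metadata records by keyword and/or bounding box.

    records: list of dicts with 'title', 'abstract', 'url' and a
             'bbox' of (min_lon, min_lat, max_lon, max_lat).
    bbox:    optional (min_lon, min_lat, max_lon, max_lat) search extent.
    """
    results = []
    for rec in records:
        if keyword:
            # Simple case-insensitive match against title and abstract.
            text = (rec["title"] + " " + rec["abstract"]).lower()
            if keyword.lower() not in text:
                continue
        if bbox:
            # Keep records whose extent overlaps the search extent.
            rmin_lon, rmin_lat, rmax_lon, rmax_lat = rec["bbox"]
            smin_lon, smin_lat, smax_lon, smax_lat = bbox
            if (rmax_lon < smin_lon or rmin_lon > smax_lon or
                    rmax_lat < smin_lat or rmin_lat > smax_lat):
                continue
        # Return the brief description plus a direct URL to the record.
        results.append({"title": rec["title"], "url": rec["url"]})
    return results

records = [
    {"title": "OS Terrain 50", "abstract": "Elevation data for Great Britain",
     "bbox": (-8.65, 49.86, 1.77, 60.86), "url": "https://example.org/terrain50"},
    {"title": "Land cover map", "abstract": "Land cover for Scotland",
     "bbox": (-7.6, 54.6, -0.7, 60.9), "url": "https://example.org/landcover"},
]

# Keyword search restricted to a rough Edinburgh-area extent.
hits = search_metadata(records, keyword="elevation", bbox=(-3.5, 55.8, -3.0, 56.0))
print([h["title"] for h in hits])  # → ['OS Terrain 50']
```

In a real catalogue the bounding-box test and free-text match would be done server-side (for example by a catalogue service), but the overlap logic is the same idea.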

Compatibility: Requires iOS 7.0 or later. Compatible with iPhone, iPad, and iPod touch. This app is optimised for iPhone 5, iPhone 6, and iPhone 6 Plus.

You can download the GoGeo Mobile App from the UK iTunes App Store.

Please provide feedback to edina@ed.ac.uk with GoGeo App entered in the subject field of your email.

Posted in EDINA, GIS, JISC, Mobile | Leave a comment

Ordnance Survey to become a GovCo at the end of the financial year


Matthew Hancock MP has just posted this statement regarding the status of the OS. 

I am today announcing the Government’s intention to change Ordnance Survey from a Trading Fund to a Government Company at the end of the financial year.
The change is operational in nature, and is aimed at improving Ordnance Survey’s day-to-day efficiency and performance. It will provide the organisation with a more appropriate platform from which to operate, and one which provides greater individual and collective responsibility for performance.
Ordnance Survey will remain under 100% public ownership with the data remaining Crown property, with ultimate accountability for the organisation staying with the Department for Business, Innovation and Skills.

Further to this change, in the coming weeks I will also be setting out more details on how Ordnance Survey will be building on its existing extensive support for the Government’s Open Data policy and on some senior appointments which will further strengthen the management team.

Ordnance Survey exists in a fast moving and developing global market. There has been rapid technology change in the capture and provision of mapping data, and increasingly sophisticated demands from customers who require data and associated services – including from government. To operate effectively, Ordnance Survey needs to function in an increasingly agile and flexible manner to continue to provide the high level of data provision and services to all customers in the UK and abroad, in a cost effective way, open and free where possible. Company status will provide that.

Mapping data and services are critical in underpinning many business and public sector functions as well as being increasingly used by individuals in new technology. Ordnance Survey sits at the heart of the UK’s geospatial sector. Under the new model, the quality, integrity and open availability of data will be fully maintained, and in future, improved. Existing customers, partners and suppliers will benefit from working with an improved organisation more aligned to their commercial, technological and business needs.
The relationship with Government will be articulated through the Shareholder Framework Agreement alongside the Company Articles of Association. The change will be subject to final Ministerial approval of these governance matters.

Ordnance Survey will also continue to publish a statement of its public task, to subscribe to the Information Fair Trader Scheme and comply with the relevant Public Sector Information Regulations, including Freedom of Information legislation, and make as much data as possible openly available to a wide audience of users.

The statement can be found here.


Posted in Uncategorized | Leave a comment

How can Public Data Group data be made more accessible and useful?

This survey invitation just came across twitterland, so we’re dropping it into GoGeo blogland. This is certainly important to monitor as it refers to the Ordnance Survey, as well as the other public sector bodies listed below.


“The Public Data Group (PDG) brings together four public sector bodies – Companies House, Land Registry, Met Office and Ordnance Survey – that collect, refine, manage and distribute data on the nation’s companies, property, weather and geography.

The Public Data Group’s data is made available through a variety of channels and licenses and includes both commercial agreements and the provision of Open Data.

The value of the data that is charged for is vast – with Ordnance Survey data widely used in the insurance sector, and the billions of pounds saved by the use of Met Office data in the aviation industry as just two examples. Equally, the value of the Open Data released by the Public Data Group is very significant and growing. The most recent estimate placed the value of Open Data released by PDG at over £900m annually.

Data is of increasing importance to the economy, driving innovation and opening a range of new possibilities for businesses.

Although the PDG organisations have a commitment to make as much data freely available as possible, they have to balance this commitment with other requirements such as maintaining the quality of the data, covering the costs of the collection and distribution of the data, and avoiding cross subsidising one data set from another. Companies House has recently committed to making its digital information available free of charge in 2015 but for Land Registry, Met Office and Ordnance Survey some data is charged for.

How you can help

We are keen that the charges for public data do not act as a barrier for those just starting their business or developing their product. There are already some ‘Developer Licenses’ available that allow usage of charged-for PDG data for free under certain criteria, and the intention is to enhance and expand these further. We are also keen to understand whether PDG data is widely known of and whether users find it convenient to utilise, so that future development work and publicity can be better targeted.

The purpose of this survey is therefore to seek your views on:

  1. Your awareness of the PDG data that is available;
  2. Any issues you face using PDG data; and
  3. How ‘Developer Licenses’ should be designed to best meet user needs.

More can be found here, including access to the seven page online survey.”

Posted in Uncategorized | Leave a comment

Jisc’s call for research data management ideas (Research Data Spring). Please cast your vote for the Cloud Work Bench proposal


As part of their effort to create new solutions to common research problems, Jisc are looking for ideas from individuals and groups with an interest in research data. Please submit your ideas to promote solutions, and offer fresh perspectives for facilitating research data management. Everyone is also invited to vote for their favourite idea, or against other ideas! A simple registration is required in order to participate.

In particular, Research Data Spring is interested in ideas that make it easier to manage research data, especially from the researchers’ perspective (in addition to protocols mentioned within the first theme); in this context, it includes the re-use of data. In other words, Research Data Spring is seeking ideas that will smooth the processes of data management, deposit and re-use within the research lifecycle. This area is closely related with “data creation, deposit and re-use”, but the two are split in order to emphasize that some ideas might be focusing on generic data management support and related protocols and solutions for deposit and re-use, while others would address key disciplinary and cross-disciplinary research aspects.

As of today, the following 25 ideas have been submitted for voters’ consideration:

  • Streamlining Deposit: An OJS to Repository Plugin
  • Badges as a proxy for peer review of data
  • Standards and Schemas for Digital Research Notebooks
  • The Lab Box: Solve local backup, work towards rich metadata
  • Exchanging experience on RDM integration and interoperability
  • Research Data Infrastructure for the Visual Arts (RDIVA)
  • Provenance and Packaging
  • Standard protocol for research equipment
  • A metadata standard to enable automated genealogy generation
  • Mock idea: note that title is limited to 68 characters
  • Integrated RDM toolkit/service
  • Data browsing tools for repositories
  • Collaboration tool for qualitative data analysis
  • One page micro repositories
  • Symplectic for RDM purposes
  • DAF Question Bank
  • BOOKISH: Infrastructure Sharing for the NLS
  • Workshops/Training on Stakeholder Support of Researchers
  • Data retrieval via persistent identifiers (DOIs)
  • Exporting from DMPonline to data journals
  • Linked data notebook
  • Use semantic desktop to capture contextual research data
  • Streamline repository submissions from Zotero profiles
  • Research Data requirements vocabulary
  • Cloud Work Bench

The one idea submitted that is relevant to the geo-community comes from EDINA at the University of Edinburgh, and below is a summary of the proposal. If you find it an idea worth supporting, please visit the Research Data Spring website and cast your vote.

Cloud Work Bench

The concept of the Cloud Work Bench (CWB) is quite simple: to provide researchers in the geospatial domain (GI scientists, geomaticians, GIS experts, spatial disciplines) with the tools, storage and data persistence they require to conduct research, without the need to manage these themselves in a local context that can be fraught with socio-technical barriers impeding the actual research. By streamlining the availability and deployment of open source software tools, by supporting auto-generated web services and by using open data, the work bench concept is geared towards removing the barriers inherent in geospatial research workflows: how to deploy the tools you want, with the storage and data management capabilities you need, without the overhead of doing it all yourself. Think of it as an academic Dropbox with additional geospatial software tools and data thrown in…

We propose piloting the CWB approach within the geospatial research community, which has a well-established and broad user base across academia and industry (reflected, for example, in the uptake of Jisc’s flagship Digimap service), and which also has a mature open source toolset and data stack that are prerequisites to conducting research, e.g. OpenStreetMap, Ordnance Survey OpenData, PostGIS, GeoServer, GDAL/OGR.

We anticipate that the CWB concept will be transferable to other domains and disciplinary contexts, e.g. statistics.

Posted in Uncategorized | Leave a comment

EDINA Geo Services at GeoDATA London Showcase 2014

Early this month, EDINA Geodata Services exhibited at the GeoDATA Showcase 2014 event in London. This was our second time exhibiting at this event, which is aimed primarily at the commercial end of the GI industry and covers current data and technology topics. It follows on from other events in the series described previously on the GoGeo Blog.

A summary of the talks can be found online.

We had a small stand, but the positive responses we got from visitors were very encouraging: from students who are currently using Digimap in their studies, to a university lecturer who said that Digimap was a great resource and essential to his teaching. Even more encouraging was the number of delegates and staff on other stands, with successful careers in the GI industry, who came up and said that they had used Digimap during their studies and that it was vital to their degree. It’s good to know that future generations in the GI industry have the expectation that they will have easy access to high quality geospatial data, readily available from Digimap (at least while they are in education!).

We talked to delegates from a wide range of industries including environmental consultancies, government, data providers, local councils, defence and education, as well as visiting and talking to many of the other exhibitors. We got a lot of useful feedback on what we’re doing and ideas for what we could be doing in the future, including potential opportunities for collaboration. Of particular interest to delegates was the Fieldtrip GB app we were demonstrating, a mobile data collection platform – especially once the magic word ‘free’ was mentioned, and that there is an open version available on GitHub.

Mince pies and mulled wine near the end were a welcome break from a long day; we were so busy that we didn’t actually get a chance to attend any of the talks, many of which looked very interesting. Nevertheless, it was a very useful event to attend. We look forward to next year’s event on 3rd December 2015.

Posted in Conference, Data, EDINA, GIS, Mobile, Open Source | Leave a comment