GISRUK 2014

glasgow

Glasgow – willsnewman (flickr)

Jane Drummond opened the 22nd conference and explained that pink was the colour of the conference, hence the helpers wearing pink T-shirts. This might also explain the pink umbrellas the last time GISRUK visited Glasgow.

Wednesday

Mike Worboys gave the keynote, “A Theoretician’s eye view of GIS Research”. He highlighted the dramatic fall in the proportion of GISRUK papers that cover the theoretical side of GIS. He mused that perhaps we had covered it all; in the end he highlighted several areas where there is still much theory to be discussed, including Geo-Semantics and Geo-Linguistics.

In the Urban Environment session, chaired by Peter Halls, we saw William Mackaness talk about Spacebook, a system that delivers directions via audio as users encounter various waypoints on a route. The research found that using landmarks gave better results than street names in terms of getting someone from A to B.

Phil Bartie, who was a researcher on William Mackaness’s paper, delved deeper into the issue of landmarks. He was using images to find out what people identified as landmarks and analysing them semantically and spatially to distinguish related and unrelated features. His use of trigrams, or groups of three words, may well be a solution to issues with obtaining good search results from EDINA’s place name gazetteer.

Nick Malleson was next, talking about using tweets as a proxy for ambient population. Despite issues with the quality and bias of Twitter data, he found that it still overcame the problems of using census data for city-centre populations when assessing crime rates. The peaks seen in crime rates for the main shopping and socialising areas disappeared once they were adjusted for the number of people present rather than the number actually living there. Outside of these areas, crime rates were still high where census data showed social problems.

The use of Twitter in research continues to raise interesting questions about sampling validity and ethics, a theme that would continue into the second day.

Thursday

Thursday was the only full day in this year’s GISRUK programme and had three parallel sessions.

Spatial Analysis: the two best talks were really quite different. Georgios Maniatis discussed error quantification and constraints in environmental sensors. Georgios was looking at sediment movement in rivers; using a local reference frame offered accuracy improvements but added further complications, not least that a significant portion of the signal travel time was through water. Given the small distance from transmitter to receiver, errors could quickly become significant.

The other talk that stood out looked at visualising the active spaces of urban utility cyclists. It was given by Seraphim Alvanides on behalf of Godwin Yeboah. Their analysis clearly showed that in certain areas of Newcastle the cycle infrastructure was misaligned with where cyclists actually rode. Cyclists used more direct routes to get to work and were more likely to detour on the way home to do shopping or other leisure activities. It does not help that the Newcastle Metro, which is operated by Deutsche Bahn, does not allow cycles onto its trains; in continental Europe operators seem more amenable to such integration.

Citizen Survey: this session looked really interesting, and Neil Harris (Newcastle University) kicked off with a very interesting description of a heterogeneous sensor infrastructure that used a schemaless approach. They had effectively decided to avoid XML and used key-value pairs instead. By using HStore they were able to hook things up with Postgres/PostGIS. The advantage of this approach was that they could integrate new sensors into the database easily, just by adding key-value pairs to the main list. Key-value pairs may be seen as old hat by many, but with HStore they give quite a flexible solution. The work is part of the Science Central project and will effectively pull together all possible data feeds for Science Central to use.
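
As an aside, a minimal sketch of what that key-value approach can look like is below. This is my own illustration, not the Science Central code: it assumes PostgreSQL with the hstore extension and the psycopg2 driver, and the table and column names are made up.

# Illustrative sketch only: one generic readings table whose variable
# attributes live in an hstore column, so a new sensor type needs no
# schema change. Assumes PostgreSQL + hstore and psycopg2.
import psycopg2
import psycopg2.extras

conn = psycopg2.connect("dbname=sensors")   # hypothetical connection string
cur = conn.cursor()

cur.execute("CREATE EXTENSION IF NOT EXISTS hstore")
conn.commit()
psycopg2.extras.register_hstore(conn)       # map hstore <-> Python dict

cur.execute("""
    CREATE TABLE IF NOT EXISTS readings (
        id        serial PRIMARY KEY,
        sensor_id text,
        taken_at  timestamptz DEFAULT now(),
        attrs     hstore      -- arbitrary key-value pairs per sensor
    )
""")

# A new sensor type just supplies different keys; no ALTER TABLE needed.
cur.execute(
    "INSERT INTO readings (sensor_id, attrs) VALUES (%s, %s)",
    ("air-quality-01", {"no2_ppb": "41.2", "temp_c": "9.5"}),
)
conn.commit()

# Pull back every reading that reports NO2.
cur.execute("SELECT sensor_id, attrs -> 'no2_ppb' FROM readings WHERE attrs ? 'no2_ppb'")
print(cur.fetchall())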

The other presentation of note was by Robin Lovelace (Leeds), who invited discussion around the merits of Twitter data in research. This was not about the ethics of whether users know what data they are giving up, but more about the pros and cons of using the data at all.

  • Con – unregulated data, unfocused, loudest voice dominates
  • Pro – diverse, low cost, continuous, responsive

Using Twitter data may raise the following questions:

  1. Who made it? – the public
  2. Who owns it? – Twitter

As the discussion progressed it was mentioned that we may be in a golden age for social data: at the moment lots of people are providing information through social media, and social media companies like Twitter are allowing us to use that information for free. At some point either the public will realise what information they are providing and seek to limit it, or the government will perhaps do so, and social media companies (who trade on information about users) may restrict access to data or try to charge for it. Interesting and thought-provoking. If you want to find out more, look at Robin’s presentation and download his code to set up a Twitter listener.
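
Purely as an illustration of the idea (this is not Robin’s code), a basic Twitter listener can be sketched in a few lines of Python with the tweepy library; the credentials and the bounding box below are placeholders.

# Illustrative sketch of a Twitter listener using tweepy (3.x-style streaming API).
# Credentials are placeholders; get real ones from a Twitter developer account.
import tweepy

auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")

class Listener(tweepy.StreamListener):
    def on_status(self, status):
        # Print time, coordinates (if the tweet is geotagged) and text.
        coords = status.coordinates["coordinates"] if status.coordinates else None
        print(status.created_at, coords, status.text)

    def on_error(self, status_code):
        # Stop on rate-limit errors rather than hammering the API.
        return status_code != 420

stream = tweepy.Stream(auth=auth, listener=Listener())
# Rough bounding box around Leeds: SW lon, SW lat, NE lon, NE lat.
stream.filter(locations=[-1.80, 53.70, -1.35, 53.95])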

Remote Sensing – I used to do remote sensing so I thought I would go to this session and see what was new. It turns out that it didn’t have a huge amount of remote sensing in it, but there were a couple of gems worth mentioning. First is the work that Jonny Huck (Lancaster University) is doing with sensors. Jonny presented Map.me at last year’s GISRUK and it was good to see this being used in other people’s research, but the sensor work took a different direction. Jonny made a low-cost (£400) pollution monitoring kit that also monitored the VO2 flux of users, which allowed him to crudely calculate pollution exposure risk. It was a simple kit using motes, smartphones and some basic GIS for visualisation. I found it quite refreshing to see a simple approach taking off-the-shelf kit and running simple experiments. This will hopefully lead to discussion, refinement and some really insightful science.

The other presentation that I enjoyed introduced Whitebox – a geospatial analysis toolkit created by John Lindsay. This is an open-source GIS package and I was stunned by how many tools it has – over 370 at the last count! Possibly most impressive was the lidar processing tool, which will happily open 16 GB of raw lidar point cloud and allow you to process it. I don’t know of another open-source package which handles lidar. John likes to call Whitebox open-access rather than open-source. What’s the difference? Well, when you open a module there is a “View Code” button. This opens the code that runs the module so that you can see how it works and what it does.

Whitebox is relatively unknown, but John hopes to push it more, and the audience suggested using GitHub rather than the Google Code repository and working towards OSGeo incubation. It does look good and I have already downloaded it. Oh, and it is a Java app so it is easy to get working on any platform.

Plenary – I enjoyed the sessions and found something interesting in each one, but the plenaries were a bit underwhelming. Most conferences use the plenaries to bring everyone together and then get the big cheeses out to show off cutting-edge research or to inspire the audience. The Thursday plenary didn’t seem to do this.

Friday – I was not able to attend on Friday, sorry.

Overall – the conference was well received and I found some of the talks really interesting. I would have liked to be inspired by a keynote at the plenary, and I hope that GISRUK 2015 in Leeds will use the plenary to motivate the group to continue to do great GIS research. Thanks to the local team for pulling the event together; it is never an easy task. You even managed to get the weather sorted.

 

 

Posted in Conference, GIS | Tagged , , | Leave a comment

Digimap for Schools adds historic map layer

DFS

Old and new

Digimap for Schools has added a new historic map layer to the popular online map service, extending its potential for use in schools across a wider spectrum of the national curriculum.

The new historic map layer features mapping from the 1890s and covers the whole of Great Britain. Teachers and pupils will be able to overlay the historic maps over current mapping and compare changes in the landscape in their areas and beyond.

Digimap for Schools is an online application developed by EDINA at the University of Edinburgh. It gives schools easy access to a wide range of Ordnance Survey mapping using a simple login and password. The service is available to all pupils regardless of age. It allows schools to access a variety of mapping scales including Ordnance Survey’s most detailed OS MasterMap and the famous OS Explorer mapping at 1:25,000 scale which is ideal for outdoor activity.

The historic Ordnance Survey maps have been scanned and geo-referenced by the National Library of Scotland (NLS) and made available in Digimap for Schools. The maps were originally published between 1895 and 1899 as the Revised New Series in England and Wales and the 2nd Edition in Scotland. The historic maps are high-quality scans at 400dpi for Scotland and 600dpi for England and Wales. This means that they can be enlarged far beyond their original scale of 1 inch to 1 mile.
Elaine Owen, Education Manager at Ordnance Survey, added: “This new layer in Digimap for Schools is a fantastic resource for teachers and pupils of all ages, especially if they’re working on a local history project. The historic layer is viewable against a range of modern map scales up to 1:10,000 scale. You can access the maps via a slider bar that allows the contemporary map to be gradually faded away to reveal the historic map. We are adding some new history and geography resources to accompany the layer, including looking at how coastlines have changed over the last 120 years.”
Pupils and teachers using Digimap for Schools can save and print maps at A4 and A3 size. The maps can be printed as a historical map, or combined with the modern map at different transparency settings as a merged image. The full set of annotation tools are available for use on the historic map, providing many opportunities to highlight changes.
Since Digimap for Schools launched in 2010, the service has been adopted by over 20% of secondary schools. 
Chris Fleet, Senior Map Curator at NLS, said: “Old maps present our history in one of its most enthralling forms. We are delighted to be collaborating with Ordnance Survey and EDINA in delivering our historic maps to schools through the Digimap for Schools application.”
Peter Burnhill, Director of EDINA at the University of Edinburgh, said: “Students, pupils and their teachers now have unrivalled access to the very best maps to gain rich understanding of how Britain’s landscape has changed in over a century. The result is endlessly fascinating; the skill and generosity of staff at the National Library of Scotland have enabled a real sense of place when combined with the Ordnance Survey maps of today’s Britain.”

Digimap for Schools is open to all schools in Great Britain via an annual subscription. The subscription costs £69 for a primary school and up to £144 for a secondary school.
Posted in Data, Digimap, EDINA, Historic, Learning, Schools | Tagged , , , , , , , | Leave a comment

Inaugural Scottish QGIS user’s Group

QGIS UK

QGIS UK

“Today we have a guest blog post from one of the Geo-developers at EDINA.  Mike works as part of the data team and is usually up to his oxters in databases ensuring that the data offered through Digimap is both up to date and in a useful format. Over to Mike.”

Following on from successful meetings in England and Wales, on 19th March I attended the inaugural “Scottish QGIS User Group” hosted at Stirling University. My first thought revolved around the level of interest that such a meeting would attract, but as it turned out, it was very popular. I was also surprised at the geographical spread of the attendees, with several folks coming from Brighton (Lutra Consulting) and Southampton (Ordnance Survey) as well as from all over Scotland and northern England, although the attendees were dominated by public sector organisations.

Talks/Presentations:

A more detailed breakdown of the presentations can be found here: http://ukqgis.wordpress.com/2014/03/25/scottish-qgis-user-group-overview/

From my own perspective, the talks on developing QGIS and cartography in QGIS were of particular interest, demonstrating the ever-growing potential of QGIS. Additionally, the improvements (particularly speed enhancements) that look to be coming soon, as highlighted in Martin Dobias’ presentation, are impressive.

As for the user group itself, it will be interesting to see where it goes from here and what direction it will take. How will future events be funded? How often should the group meet up, and where? A recommendation from myself would be to have general presentations and talks in the morning, then in the afternoon split into different streams for beginners, users and developers.

At the end of the meet-up (and a few geo-beers in the pub) there was definitely a sense that everybody got something out of the event and would like to attend more meetups in the future.

A special mention of thanks needs to go out to Ross McDonald – @mixedbredie (Angus Council) – for his efforts to organise the event, and additionally to thinkWhere (formerly Forth Valley GIS) for sponsoring the event.

Links and useful things

Posted in Cartography, GIS, Open Source, OSGeo | Tagged , , | Leave a comment

The search for Flight 370

flight370

courtesy of Wayan Vota (https://www.flickr.com/photos/dcmetroblogger/)

As the search for the missing Malaysia Airlines Flight 370 approaches its fifth week, the reliance on geospatial technology, and on the skills to analyse large volumes of data, is becoming increasingly clear. In this post we will look at some of the geospatial technology and techniques that have been used in the search for Flight 370.

Background

Flight 370 disappeared on the 8th of March 2014, having left Kuala Lumpur en route for Beijing. There was simply no trace of it. Communications were lost somewhere over the Gulf of Thailand. Speculation quickly rose as to the fate of the aircraft, with hijack and rogue pilots being mooted as possible explanations. A catastrophic break-up of the aircraft through an explosion was not ruled out, but looked unlikely as this would generally be noticed. Furthermore, there was no sign of debris in the area of Flight 370’s last known position.

Data feeds and extrapolation

After a few days, data started turning up that suggested the plane had stayed aloft for several hours after all communication was lost. Equipment onboard transmits information such as status updates and diagnostics, so that engineering teams can monitor the health and performance of components while they are in use.

The engines had sent bursts of data every hour and these had been picked up by a satellite operated by Inmarsat. By monitoring the Doppler effect in the received data, Inmarsat was able to chart two possible paths: one to the north and the other to the south. This had never been done before, and the innovative use of this data by Inmarsat allowed the rescue effort to be concentrated in two distinct areas.

After a bit of tweaking and refining, the Inmarsat scientists were able to discount the Northern corridor and the search re-focused on the Southern corridor, a vast expanse of ocean west of Australia with no suitable landing site.  How they achieved this was really quite clever. They used “truthing data” from other aircraft to monitor the Doppler effect and therefore refine their estimates for Flight 370. They then calculated the speed and altitude of the aircraft and were able to work out roughly where it would have run out of fuel and ditched into the ocean.  This greatly reduced the search area.

Satellite analysis

The search area had been focused to a small section of ocean (OK, so small in this case means the size of Western Europe, but given the size of the southern Indian Ocean this can be considered small). It was now feasible to start analysing aerial imagery to try to identify debris (given that there was nowhere for the plane to land, on 24th March Malaysian officials announced that it was beyond reasonable doubt that the plane was lost after ditching in the southern Indian Ocean). Trawling around to find out which satellites were used was harder than I thought it would be. Below is a summary of what I found:

  • GAOFEN-1 – a high-resolution optical sensor run by CNSA (the Chinese National Space Administration), launched in April 2013. Gaofen-1 is equipped with a 2 metre resolution CCD (charge-coupled device), an 8 metre resolution multi-spectral scanner and a 16 metre resolution wide-field multi-spectral imager. It is difficult to tell which sensor produced the image below, but from the resolution it looks like it was the 8 m multi-spectral scanner.
chinese satellite

Chinese satellite image of possible debris – Pic from The Guardian/Reuters

  • A French satellite operated by Airbus Defence and Space spotted 122 objects in a cluster, the largest up to 23 m in length (image released by MOSTI). Airbus Defence and Space have a host of satellites run through their Astrium arm, including EnviSAT, CryoSAT, Copernicus, ELISA and Helios 2.
French

Airbus Defence Image

  • Australian AP-3C Orion – Orion aircraft were deployed to likely search areas and scanned the area. It is likely that the crew were using a combination of electronic surveillance systems and just their eyes. This might seem old school, but it is an effective method of verification, as trained operators can discount or confirm sightings from remote sensing. The aircraft has a long range and can fly low, making it ideal for searching.

Ocean Currents

Why has it taken so long to refine the search area? Well, there are lots of satellites, but only a few of them would have had suitable sensors on board. Data is collected and beamed back to a receiving centre, and the raw data will most probably have to be processed before it can be used for anything. This takes time. The search area may well have been narrowed to a chunk of the southern Indian Ocean, but this still represents a huge area, not dissimilar to the size of Western Europe. Processing and analysing data for such a large area is not easy and will rely on a degree of automation followed by human verification.

The southern Indian Ocean is a wild place with frequent storms. We can see from the above that optical sensors have been used, and these are unable to penetrate cloud cover. Scientists would have to wait for the satellite to pass over the same area to try to get a better, cloud-free image; the repeat cycle may be anything from 1 day to 10 days or more.

Then you add in the ocean currents. Any object floating in the ocean will not be static and could drift by tens of kilometres a day. Given that the plane is likely to have crashed 15 days previously, debris could be hundreds of kilometres from the crash site – that is, if it has not already broken up and sunk. But we can at least model the ocean currents and estimate the potential dispersal of the debris. The NY Times has some excellent visualisations of both the currents and the wave heights in the southern Indian Ocean during March. These have been produced by the National Oceanic and Atmospheric Administration and the National Centers for Environmental Prediction using remote sensing data, in-situ data (buoys) and models. While never 100% accurate, they provide an indication and convey the uncertainty involved in determining a search area.

Locating flight recorders

Once a search area has been identified, the searchers are able to deploy listening devices to locate “pings” emitted by Flight 370’s black box. This is achieved by towing a listening device (a TPL-25 towed pinger locator) back and forth across a wide area. Pings should be received periodically, and the position and strength of these should triangulate the position of the black box. But the sea floor is not flat in this area: it is around 4500 m deep with mountains up to 2500 m high. We actually know very little about remote ocean sea beds. We have limited data collected by ships and most representations come from spaceborne remote sensing data. These are not very accurate and may “miss” large structures (1–2 km high) such as seamounts. There is a nice overview of ocean mapping on the BBC website.

The difficulties of retrieving debris from deep, remote oceans were highlighted by the search for Air France Flight 447. In that case, both black box transmitters failed.

A Chinese ship detected a ping on the 5th of April and a day later an Australian ship detected a ping, but the pings were quite far apart. The Australian ship’s detection seemed more consistent and stronger, and this was backed up by more detections in the same area on the 8th. It is a slow process, but each detection should help reduce the uncertainty. The question is, will the batteries in the transponders last much longer? They are already at the limit of what is expected, so time is running out.

Remote Sensing Critical

It is clear that remote sensing technology has been critical at every stage of the search for Flight 370, and it will continue to be so until the plane is found. It has been used effectively to narrow search areas and discount blind alleys. It is also interesting to note how associated data has been used in ways that were never intended in order to locate the plane, and praise should be given to the Inmarsat scientists who came up with a novel solution when data and information were scarce.

Articles:

  • The search for Malaysian Airlines Flight 370 – a great article in the New York Times that focuses on the remote sensing data that is being used now that search teams have identified a “likely” crash site in the Southern Indian Ocean.
  • Wikipedia – a growing resource for information about Flight 370
  • Wikipedia – Air France Flight 447
  • NY Times – nice collection of visualisations of ocean conditions in the search area
Posted in Remote Sensing | Tagged , , , , , , , , | Leave a comment

Create a linear buffer tool for Digimap for Schools

I’m working on the development of a new linear buffer tool for the Digimap for Schools service. Linear buffering is a common feature in GIS applications.

line-buffer-example

200 metre buffer on a part of the River Clyde in Glasgow

In geometrical terms such an operation on polygons is also known as a Minkowski sum, or offsetting.

I was looking for a JavaScript library that would offer such functionality, as OpenLayers 2.13, currently used by Digimap for Schools, does not offer this as part of its codebase.

I came across two libraries that offer this sort of functionality: JSTS and jsclipper, the former being a port of the famous Java JTS Topology Suite and the latter a port of the C++, C# and Delphi Clipper library. I finally decided to go for jsclipper, as I was unable to build a custom cut-down version of the huge JSTS library.

The resulting tool makes use of jsclipper to calculate the buffer polygon, along with OpenLayers to draw the buffer polygon and the inner linear path.

A standalone example along with the code, making use of EDINA’s OpenStream service, can be found here:  (Full screen here)

One of the challenges encountered was jaggy rounded ends at small buffer widths, which is down to the way jsclipper handles floats. Fortunately jsclipper provides a way to scale up coordinates before passing them in for offsetting, and to scale them down again before drawing. The Lighten and CleanPolygons functions also provided a way to remove unnecessary points and merge points that are too close together in the resulting buffer polygon.
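
The same scale-up / offset / scale-down workflow is easiest to sketch with the Clipper library’s Python port, pyclipper, which exposes an equivalent API. This is only an illustration of the approach, not the JavaScript used in Digimap for Schools, and the coordinates and widths are made up:

# Sketch of Clipper-style line offsetting (buffering) using pyclipper, the
# Python port of the same Clipper library that jsclipper wraps.
import pyclipper

SCALE = 1000  # Clipper works on integer coordinates, so scale up first

# A polyline (e.g. a river centre line) in map units, and a buffer width.
line = [(0.0, 0.0), (50.0, 10.0), (120.0, 5.0)]
buffer_width = 20.0

scaled = pyclipper.scale_to_clipper(line, SCALE)

pco = pyclipper.PyclipperOffset()
# Round joins and round open ends give smooth caps on an open line.
pco.AddPath(scaled, pyclipper.JT_ROUND, pyclipper.ET_OPENROUND)
solution = pco.Execute(buffer_width * SCALE)

# Scale back down before handing the polygon(s) to the drawing layer.
buffer_polygons = pyclipper.scale_from_clipper(solution, SCALE)
print(len(buffer_polygons), "polygon(s),", len(buffer_polygons[0]), "vertices")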

All in all, jsclipper is a light, fast and robust library for polygon offsetting, and I would recommend having a look at it: https://sourceforge.net/p/jsclipper

Posted in Code, GIS, Python, Schools | Leave a comment

COBWEB Meets the Commissioner at GEO-X

Nicola Osborne, from the COBWEB project team, explaining all five Citizens Observatories projects to the European Commissioner for the Environment, Janez Potočnik.

Last week members of the COBWEB project team attended the GEO-X Plenary and Geneva Ministerial Summit event in Geneva, Switzerland. GEO, the Group on Earth Observations, has held annual plenary meetings since the organisation was established in 2005, and GEO-X marked a particularly significant milestone, with participants looking forward to the next 10 years of activity across GEO and GEOSS (the Global Earth Observation System of Systems). COBWEB was therefore delighted to be invited to be part of the European Commission delegation at the GEO-X exhibition, and to be able to take up opportunities to give a Speakers Corner talk on the project, to speak at the Citizens’ Observatories side event, to speak at the AIP-6 side event, and to show posters at the OGC stand, including both a poster and a video which was shown in the European Commission Speakers Corner area.

Chris Higgins talks about COBWEB during the Citizens Observatories side event.

Chris Higgins talks about COBWEB during the Citizens Observatories side event.

The project was well represented by members of the COBWEB team covering technical, organisational, stakeholder engagement and dissemination activities, which enabled very productive discussions and networking to take place throughout the week. COBWEB’s Architecture Implementation Pilot (AIP-6) achievements in enabling single sign-on access across the COBWEB access management federation triggered some great conversations for Andreas Matheus, Bart De Lathouwer and project coordinator Chris Higgins. This contribution to the wider GEO community was also acknowledged in a GEOSS showcase film shown as part of the Ministerial Summit.

COBWEB’s development and infrastructure were highlighted in Chris Higgins’ talk during the Citizens’ Observatories side event, with a particular call made to our fellow Observatories to discuss and collaborate around interoperability across the projects. Meanwhile our stakeholder engagement work to date, including collaborations with organisations such as Dyfi Woodlands, was the focus of our Speakers Corner talk by Jamie Williams.

Jamie Williams gives his Speakers Corner talk at GEO-X

Jamie Williams gives his Speakers Corner talk at GEO-X

Citizens’ Observatories emerged as a key consideration for future environmental policy making, and the GEOSS representatives were keen to ensure interoperability and access to citizen science efforts in generating environmental data, in the interests of local, regional and global efforts to ameliorate environmental impact. Citizens are key to the European Commission vision as articulated in Horizon 2020, and therefore the initial tranche of Citizens’ Observatory projects, of which COBWEB is one of five, are trailblazing the future shape of citizen involvement as stakeholders in environmental policy.

The COBWEB project was present at the European Commission exhibition area through a shared stand with all of the Citizens’ Observatories projects: COBWEB, Citi-Sense, WeSenseIt, Omniscientis, and Citclops. Three collaborative posters were created especially for this space, outlining all of the projects in more detail and highlighting areas of commonality across the projects. We were excited to meet GEO Plenary participants and fellow exhibitors throughout the week – with over 89 GEO member nations present this was a truly diverse group. However, the highlight of the exhibition was the visit to the stand on Friday 17th January by Janez Potočnik, the European Commissioner for the Environment. The image at the top of this article (taken by Stuart Wrigley from the WeSenseIt project) shows Nicola Osborne, from the COBWEB project team, explaining all five Citizens’ Observatories projects to the Commissioner.

For the team, the event also presented a wealth of opportunities to meet, learn from and exchange ideas with projects and organisations from across the world, and particularly to make connections to complementary activities funded by the European Union. It was a hugely useful week and we look forward to sharing more from the event, including our presentations, posters, and a brief video on the AIP-6 work showcased, over the following weeks.

Find out more:

To find out about what COBWEB is and how EDINA is involved, please check out the COBWEB site.

Original blog article - COBWEB Meets the Commissioner at GEO-X

Posted in Conference, Inspire | Tagged , , , | Leave a comment

Space Charter activated in response to UK flooding

Flood Warning

Spotted on the BBC News page, the recent flooding in the UK (yes, it has been even wetter than normal over here!) has prompted the UK Government to activate the global charter on space and natural disasters. This essentially gives government agencies access to the most up-to-date imagery of affected areas allowing them to plan relief and contingencies.

There is no end in sight for the bad weather, which is being driven by a very strong jet stream. This has resulted in a number of deep depressions passing over the UK, battering the coasts and dropping lots of precipitation. Depressions are not unusual at this time of year, but their frequency and intensity have given the rivers little time to recover before the next assault.

Useful links:

Flood

Flood – Pic courtesy of Johndal (http://www.flickr.com/photos/johndal)

Posted in Remote Sensing | Tagged , , | Leave a comment

Top 5 geo books for Xmas

Following up on yesterday’s post about presents for geo-geeks, here is a list of books that geo-geeks might like to receive this Christmas. There has been a recent flurry of map-related books, and this list will focus on these more mainstream publications rather than the technical titles you might find in the “Books” section of GoGeo.

1. Around the world atlas:  this looks like a great modern take on the classic atlas for children.  Bright, colourful and full of interesting facts represented by infographics. Price: £32

Atlas

Around the world in 80 pages

2. Maps: This book of maps is a visual feast for readers of all ages, with lavishly drawn illustrations from the incomparable Mizielinskis. The maps show not only borders, cities, rivers, and peaks, but also places of historical and cultural interest, eminent personalities, iconic animals and plants, cultural events and many more fascinating facts associated with every region. Price £11.

Maps

Maps and more

3. Pocket Atlas of Remote Islands: a small book that only contains maps of remote islands, many of which you have never heard of and will probably never get the chance to visit. The cartography is quite simple, but that is the beauty of this book. Most of the islands could easily be Treasure Island if you allow your imagination to run away a bit. Price £10

Islands

Pocket Islands

4. The Lands of Ice and Fire: If you are into the Game of Thrones then this is a must. A dazzling set of maps, featuring original artwork from illustrator and cartographer Jonathan Roberts, transforms Martin’s epic saga into a world as fully realized as the one around us. Price £20.

FireandIce

Game of thrones

5. From Here to There: A series of hand-drawn maps that map both real and imaginary places as well as some slightly “off-the-wall” maps.  From Here to There bridges cartography and art. Price £10

tothere

Hand drawn maps

 

Posted in Cartography | Tagged , , , | Leave a comment

Top 10 Christmas ideas for geo-geeks

It’s that time of year again: finding the perfect gift for those we care about and trying to be just a little bit different, or going that extra mile to get a gift that really will be cherished. Increasingly, we map geeks have a host of carto-related gifts that we can buy each other. This post highlights some of them and will hopefully stand as a bit of inspiration for anyone who wants to treat a geo-geek who has been particularly good this year.

First off, you could do worse than looking at the list that I put together last year. Some of the links no longer map(sic) to the correct product, but some do. Beyond that, the list below should give you some ideas of what I have stumbled across this year:

1. SplashMaps – SplashMaps are REAL outdoor maps designed for clarity and accuracy. They are washable, wearable, all-weather fabric maps. Prices are about £20 for a standard sheet and a bit more for a custom Make-a-Map sheet. Note that Make-a-Map maps cannot be delivered before Christmas; please choose the map voucher for an even more personalised gift. There is a useful image showing the availability of standard maps that would be delivered by Christmas.

splash

Splash Maps

 

2. Escape maps – available through many sellers on eBay are genuine WW2 escape maps. These tend to be printed on silk and were issued to servicemen, generally RAF, when they were on missions behind enemy lines. The maps were light and easy to hide in clothing but gave servicemen routes to escape back to allied territory. I may have bought one last year and it is beautiful! Buy one and you will own a little piece of history. Prices vary, but around £25.

escape

A little bit of history

3. Jerry’s Map – if you don’t know what Jerry’s Map is, then please look at this video. You can buy tiles from the map; these are copies of the original, but as the map is constantly evolving you actually end up with something that is unique. Again, I may have already bought some tiles and they are amazing. If you happen to be in Edinburgh (perhaps over Hogmanay) then you can see the actual map, which is on show at Summerhall. Tiles are available through eBay direct from the man himself. Prices vary, but start at less than £5.

JerrysMap

Jerry and the map

4. Bespoke map art – yes, I know they featured last year, but they have expanded their range and there is something for every type of map geek. Prices range from £35 to over £50.

MapArt

Map Art – in this case a clock!

5. Cartographic T-shirts – stylish T-shirts with subtle cartography on them. Nonfictiontees – £12, TFL Beck Tube map – £12, Polar projection goodness – £16.

tshirt

Map on a t-shirt

6. Animal World Map – this is a massive wall map of the world for kids. Each country is represented by the animals that are associated with it, although the UK seems not to get any animals, just Big Ben. Surely we could have had sheep, salmon, a Highland coo or a haggis! Prices are £21 for an A3 copy and £26 for an A2 copy.

animal

One for the kids

7. Typographic Maps – another case of “where art meets maps”, these typographic maps show several cities of the world (London, New York, Seattle…) with features represented just by their names. The A1 posters cost £26.

Typo

Typo? What typo?

8. Map bling – jewellery with maps on it for him or her? Sorted. Prices from £15–£50.

Bling

Bling

9. Wapenmap – a Wapenmap is a 3D contoured stainless steel map landscape sculpture. Cost is about £18.

3D

3D Maps

10. A map – any map. A true map geek will get a kick out of receiving a map. The map could be new, old, antique or foreign, and it could link into a trip or anything. I have received a Terry Wogan weekender map (don’t ask), and it is great. I have given people old maps of where they live and taken pleasure as they analyse how things have changed (Oxfam is a great place to get an old OS or Bartholomew’s map).

That’s your lot. What, no books, I hear you say! Well, I will put up a list of some top map-related books tomorrow.

Note – thanks to James Cheshire, who blogged his wish list earlier in the month; I have blatantly stolen one or two ideas from it ;)

Posted in Uncategorized | Tagged , , , | Leave a comment

Creating a transparent overlay map with mapbox-ios-sdk

For this blog post I have managed to capture one of EDINA’s mobile developers. Their guest article describes how to create transparent overlays for mobiles using mapbox-ios-sdk.

I am working on a historic map overlay, where the user can adjust the transparency of the historic map. Using the slider, the user can then see how land use has changed over time.

opacity-map-overlay

I am going to use the map-box fork of Route-Me. It looks like a well bug-fixed version of Route-Me, and map-box do seem to have some great products.

Unfortunately it doesn’t have an API to dynamically change the opacity of a tile source out of the box, so I added one.

It’s pretty easy to add. Each tile source has an RMMapTileLayerView container when added to the map; within that you can manipulate the CALayer opacity to get the desired effect.

I added a fork to github for testing

https://github.com/murrayk/mapbox-ios-sdk/

An example of its use follows – the code is on GitHub. Do a ‘git clone --recursive’ to install the submodules.

https://github.com/murrayk/mapbox-overlay-opacity-example

An example of use, in the main view controller:

- (void)viewDidLoad
{
    [super viewDidLoad];
        // Do any additional setup after loading the view, typically from a nib.
    RMOpenStreetMapSource * openStreetMap = [[RMOpenStreetMapSource alloc] init];
    RMGenericMapSource * weatherMap = [[RMGenericMapSource alloc] initWithHost:@"tile.openweathermap.org/map/clouds" tileCacheKey:@"cloudCover" minZoom:0 maxZoom:18];

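    // OpenStreetMap is the base layer; the cloud-cover tiles are added on top as an overlay.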
    self.mapView.tileSource = openStreetMap;

    [self.mapView addTileSource:weatherMap];

    self.overlay = weatherMap;
    // rough bb W = -30.0 degrees; E = 50.0 degrees; S = +35.0 degrees; N = +70.0 degrees
    NSLog(@"zooming to europe");
    CLLocationCoordinate2D northEastEurope = CLLocationCoordinate2DMake(70,-30);
    CLLocationCoordinate2D southWestEurope= CLLocationCoordinate2DMake(35,50);
    [self.mapView zoomWithLatitudeLongitudeBoundsSouthWest:southWestEurope northEast:northEastEurope animated:YES];

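    // Start the overlay at 50% opacity; the slider below adjusts it at runtime.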
    [self.mapView setOpacity:0.5 forTileSource: self.overlay];

}

// Hook up a slider to manipulate the opacity:

- (IBAction)changeOverlayOpacity:(UISlider *)sender {

    NSLog(@"Slider value changed %f", sender.value );
    [self.mapView setOpacity:sender.value forTileSource: self.overlay];
}
If you found this blog post useful, you might want to look through the archived articles on EDINA’s developers’ Geo-Mobile blog.

 

Posted in Cartography, Code, EDINA, GIS, Historic, Mobile | Tagged , , , , , | Leave a comment