I’ve been using the PeakFinder app for a month or two now. It is a nice app for showing which hills are in view. Basically it gives a ‘live’ wireframe of the hills from your location, or anywhere you like. All the features are listed on the PeakFinder App page.
Today I opened the app and it must have been updated, because it gave me a message saying:
For a long time many of you have asked for an option to combine the image of the camera with the panorama drawing. I’ve finally implemented this feature in this newest version and so PeakFinder now also supports true augmented reality.
This is quite amazing, and in my tests it works a treat.
I think this is the first AR I’ve seen that makes me think this could really be useful, and soon. It is not much of a stretch to imagine a botany app that can recognise flowers.
What is cool about PeakFinder is that the data is loaded onto the phone, so you do not need a connection to use the application.
I occasionally make simple mashups of gpx, google maps and flickr photos of walks. I record the gpx with the Trails app on my phone, and take photos with the phone too, as they are nicely geotagged and flickr can use that information and provide it in the API.
One of the things I noticed was that the GPX files can be pretty big, over a megabyte each. I knew there was probably a lot of information in each file that was not needed to display the path on the map, but I was not sure how to strip it out easily. I think I’ve used online services for this before, but finding a site, uploading a file and downloading the result is a lot of bother for something that I hope will be quick and simple. I also expect that the audience for the pages produced is one.
Having a look inside the gpx files, I thought that you could probably slim them down considerably; each point is recorded like this:
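A typical GPX 1.1 track point carries elevation and time alongside the coordinates, and it is those extras that make the files fat. A sketch of the slimming in Python (the sample track below is invented; five decimal places of latitude is roughly metre precision, plenty for a map):

```python
# Sketch: strip everything but rounded coordinates from GPX track points.
# Element names follow the GPX 1.1 schema; the track data is made up.
import xml.etree.ElementTree as ET

GPX_NS = "http://www.topografix.com/GPX/1/1"
ET.register_namespace("", GPX_NS)  # avoid ns0: prefixes on output

sample = """<?xml version="1.0"?>
<gpx xmlns="http://www.topografix.com/GPX/1/1" version="1.1">
 <trk><trkseg>
  <trkpt lat="55.93231487" lon="-4.45941123">
   <ele>231.4</ele><time>2011-08-02T21:27:10Z</time>
  </trkpt>
  <trkpt lat="55.93240991" lon="-4.45957201">
   <ele>233.1</ele><time>2011-08-02T21:27:20Z</time>
  </trkpt>
 </trkseg></trk>
</gpx>"""

def slim(gpx_text, places=5):
    root = ET.fromstring(gpx_text)
    for pt in root.iter("{%s}trkpt" % GPX_NS):
        # Drop elevation, time etc.; a map display only needs coordinates.
        for child in list(pt):
            pt.remove(child)
        pt.set("lat", str(round(float(pt.get("lat")), places)))
        pt.set("lon", str(round(float(pt.get("lon")), places)))
        pt.text = None
    return ET.tostring(root, encoding="unicode")

slimmed = slim(sample)
```

On a real file with thousands of points this sort of trimming cuts the size dramatically, and thinning points (keeping every nth) would shrink it further.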
I’ve been interested in combining maps and media for a while now. Here is a recap of some of the methods I’ve been using. I’ve not often had the chance to do this sort of thing in a teaching situation but continue to believe that mapping media would be a valuable way to record experiences for pupils and a nice slant on digital storytelling.
Last Sunday I had a walk to Benvane & Ben Ledi, recorded the gpx with the iPhone Trails app (one of my top 10 apps), and took photos, video and some panoramas. Here are the three ways I’ve been developing of displaying them on the web. None of these are good examples of storytelling, as I am still thinking about the workflow and tech.
Photos on the map
I’ve built up a fair collection of these over the last few years. This one only uses iPhone photos, which means I can skip the stage of matching photos to the gpx file. When I started doing these the google maps API was at version 1; I moved to 2 and am now behind, with version 3 out.
This is the most conventional story, combining an image with text in a linear fashion along the track.
I am hoping that this can produce a more contemplative result.
Although I’ve only just worked out how to do this, the workflow is a lot simpler than the photo maps. I’ve developed a Mac application (using SuperCard) to make these. All I need to do is drag some iPhone videos out of iPhoto onto the application and it creates the smaller versions of the video and the HTML to display them alongside the maps.
If you have a Mac and would be interested in trying the app, let me know.
This is the most recent development: after I tweeted about the video maps, @drewburrett suggested using Photosynth for the iPhone to take pano photos and do something similar. I’ve not got a workflow for creating these, and I don’t think I’ve got the display method right yet, but I am quite excited about working out different ways to present pano photos.
As I said, I’ve been messing with maps and media for a long time (2006 example) and I’ve blogged about it a fair bit, pretty much in a vacuum. I’d be really interested in finding some folk to play along with, or a school interested in trying out some of this stuff.
I’ve blogged before about the wonderful Hmsg Spiral Map, a project that combines video, audio and google maps into a mesmerising meditative experience.
Recently I noticed that iPhoto shows the location of videos as well as photos, which got me thinking a wee bit. I checked out a few exif tools and found that the location was stored in the exif data in the same way as for photos.
I had already made some crude tools to map walks on google maps, and made an odd foray into adding sounds to photos: burn, so I thought I might be able to knit together some video and maps.
The list of movies and locations is loaded from an xml file that is a very simple list: <item><file>loch_humprey_02.m4v</file><loc>55.9323,-004.4594</loc><dc> 2011:08:02 21:27:10</dc></item> I thought xml was a good idea as it would allow reuse, to display the movies in different ways. As the movies are shown, the location is used to show a couple of images using the google maps static api. This first Video Map Experiment was cobbled together using a couple of command line tools (pcastaction, built into Mac OS X, and ExifTool by Phil Harvey). I am not knowledgeable about shell stuff but it can often help do interesting things, and once you figure it out it is easy to reuse.
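Reading an item and building the pair of static map image URLs takes only a few lines. A sketch in Python (the endpoint and parameter names are from the Static Maps API v2 as it stood then, including the once-required sensor parameter; the zoom levels are just a plausible choice for a close-up and a context view):

```python
# Sketch: parse one <item> from the movie list and build two
# Google Static Maps image URLs for its location.
import xml.etree.ElementTree as ET
from urllib.parse import urlencode

item_xml = ("<item><file>loch_humprey_02.m4v</file>"
            "<loc>55.9323,-004.4594</loc>"
            "<dc> 2011:08:02 21:27:10</dc></item>")

item = ET.fromstring(item_xml)
movie = item.findtext("file")
loc = item.findtext("loc")

def static_map_url(loc, zoom, size="300x200"):
    # Endpoint and parameters from the Static Maps API v2;
    # "sensor" was a required parameter at the time.
    return ("http://maps.google.com/maps/api/staticmap?" +
            urlencode({"center": loc, "zoom": zoom, "size": size,
                       "markers": loc, "sensor": "false"}))

close_up = static_map_url(loc, 14)  # zoom values are just a choice
context = static_map_url(loc, 8)
```

Dropping each URL into an img tag next to the playing movie gives the two-map effect with no JavaScript at all.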
Drag videos from iPhoto onto a field in a SuperCard project I’ve made.
Click a button on said project which:
Asks me to choose a folder
Gathers locations & date/time from the video files
Makes a copy of the videos in the folder, shrinking file size & dimensions (this takes a few minutes)
Creates an xml file & an index.html file in the folder to show the videos
I then upload folder to server via ftp.
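The xml-writing step above could be sketched like this in Python, following the <item><file/><loc/><dc/></item> structure from the video maps (the enclosing element name and the movie data here are invented):

```python
# Sketch: write the xml list the map page reads, one <item> per movie.
# "items" as the root element name is a stand-in; the data is invented.
import xml.etree.ElementTree as ET

movies = [
    ("ben_ledi_01.m4v", "56.2294,-4.3312", "2011:09:04 11:02:31"),
    ("ben_ledi_02.m4v", "56.2301,-4.3390", "2011:09:04 11:40:05"),
]

root = ET.Element("items")
for name, loc, taken in movies:
    item = ET.SubElement(root, "item")
    ET.SubElement(item, "file").text = name   # movie file name
    ET.SubElement(item, "loc").text = loc     # "lat,lon" from the exif data
    ET.SubElement(item, "dc").text = taken    # date/time the clip was taken

xml_out = ET.tostring(root, encoding="unicode")
```

Keeping the list as xml rather than baking it into the HTML is what makes the reuse possible: the same file can drive different display pages.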
I hope this could be an interesting way to tell a story, record a trip or describe a place. I’d be interested to know what other folk think.
On Friday evening I went along to the evening presentation and discussion part of this event. There had been an afternoon training workshop on the practicalities of field recording, run by the evening’s presenters, which was limited in numbers. My attention had been drawn to the event by a tweet from @scottishmusic, I guess because I post the odd recording to the UK Sound Map. The evening was a little bit different from the educational conference/TeachMeet meetings that I am more likely to be found at, but I am really glad I went.
The first presentation was by theatre maker Tim Nunn of the company Reeling & Writhing. He spoke about his work in progress Formel; inspired by Chaucer’s Assembly of Fowls, the play uses field recordings extensively. Tim spoke about how he wrote the play working back and forward between text & field recording, each affecting the other. A lot of the sound was recorded on Islay, and he played us fragments of a force 10 gale and rooks mobbing an eagle. Here is a taste of the play I found on the Formel page.
As someone who is as musical as a turnip I can’t really comment on the work, other than to say I enjoyed listening to it. The Blast Beach images & sound were interesting as a very polished combination of photos and edited found sound. As someone who takes phone photos and records the odd sound when out and about, it shows where the idea can go given a deal of talent in photography & music.
Ian Rawes spoke about several sound map mashups, starting with his own London Sound Survey, which features London maps, ambient sound recordings, sound maps, local history & London wildlife. This is a site to get lost in; the quote on the front page, “Perhaps the most ambitious and comprehensive approach to sound mapping I’ve yet to see . . . an all-around wonderful site!” (from Jim Cummings, Acoustic Ecology Institute), rather understates it! Ian is the Vault Keeper at the British Library sound archive (I think). Ian briefly showed us round the London Sound Survey, playing a recording of a street preacher (there are quite a few) and a Common Pipistrelle bat recorded with a Magenta heterodyne bat detector set to 45 kHz and an Edirol R09-HR digital recorder, which gives you an idea of the range of the site if not the depth. He also showed us the London map with present-day street maps, historical maps and sound recordings.
Ian is also the person behind the UK Sound Map; I’ve blogged about this before and contributed a few boos to the map. What I love about the project is that it is open to anybody to contribute, it is easy to do so, and it mashes maps & sound.
We then saw the Acoustic map from 12 Gates to the City (“The acoustic map is an ever-growing collection of 1 to 5 minute sound recordings embedded on a world map at the exact location of each recording.”), created by Jonathan Prior, an Edinburgh-based creative researcher, who was sitting in the audience. Jonathan’s map is interesting because it uses UMapper rather than google maps, and it looks and sounds good. We heard the underwater recording of periwinkles grazing on algae, which sounds nothing like you would expect. It looks as if there is a lot of interesting stuff on 12 Gates to the City.
The Inukjuak Sound Map is another map sound mashup this time created by Montreal sound artist Nimalan Yoganathan. The map has cultural and natural sounds, some with images. It uses google maps. We also watched Charles Veasey’s Hmsg Spiral Map which I had seen before, but it was interesting watching with other people on a large screen rather than in one’s own home with multiple on and offline distractions. The Hmsg project is a flash/video/google maps mashup.
One of the main impressions I got from the event was the quality of the audience’s listening; this made the evening quite quiet and contemplative, quite different from, say, a TeachMeet or educational tech event. I had not taken a laptop or iPad to take notes, but if I had I would not have used them, and I didn’t take any photos either although I had a phone with me. In googling the links for this post I re-read Inukjuak Sound Map and Hmsg Spiral Map on Ian’s London Sound Survey blog:
The Spiral Map looks and sounds very impressive as it progresses smoothly through its 30 different sound recordings and videos. Most of the videos have very little motion in them and much more action is heard than seen. It’s a great way to set a balance between the ravenous eye and the patient ear.
I came away straight after the event (an empty stomach and dinner waiting kept me from the pub) with open ears, walking to the train station listening more than usual. At the station I was surprised that the announcements and the clicking and clacking of high heeled shoes were louder than the train.
I’ve also been thinking of how this could relate to the classroom. Here are some ideas off the top of my head:
I’ve often used photos and sometimes video as stimulation for creative writing; following Tim Nunn, we could add recordings as a great stimulus. Children recording sounds from a trip, as well as taking photos and videos, could be a powerful addition to stimulate writing and discussion back in the classroom. I’ve also had children record poetry with backing music; perhaps found sound could be used as well.
I’ve often involved pupils in creating movies from still pictures, adding their voices with iMovie. It could be really interesting to add recordings Timothy Cooper style; I think some children would be excited to work this way.
I’ve been building picture and gps map mashups for a while now, occasionally incorporating audio, and recently mapped my boos; this could easily be adapted for a school trip or for a collection of schools to work together. Or perhaps schools could contribute to the UK Sound Map itself. Playground sounds across an authority, or skipping songs, could be a starter.
In the afternoon workshop there had, I think, been a lot more technical information. Ian provides a Budget binaural stereo microphones guide on the London Sound Survey. A lot of the field recording crowd seem to know what they are talking about kit-wise; I was somewhat relieved when Ian appropriated the Best Camera quote: the best microphone is the one you have with you.
In talking about UMapper, Ian said it was in some ways easier to use than google maps. This is probably right, but I like the way google maps can be used, via the api, so that things are added automatically, without crafting each page by hand.
Finally, what I took away was the quality of listening shown by the audience & presenters. The time taken. Timothy Cooper’s Blast Beach gave plenty of time to look at the images: audio can be slower. I am thinking again about Ian Rawes’ “the ravenous eye and the patient ear”, and Tim Nunn’s theatre performances in the dark.
From the above you can see I’ve gathered a great number of links, sites not only to visit but to revisit. It is not often you get the chance to hear periwinkles eating.
AudioBoo must be one of the simplest ways to do audio podcasting and it has many nice features. The one that interests me most is the fact that the RSS feed has geo information in it, that is, the location the boo was recorded in (users can, I believe, turn this off). I have played about with the google maps api in simple ways (eg some walks) and really like the ability to tell a story in space as well as time.
Yesterday I thought I’d have a look at the AudioBoo RSS feed (Atom really) and see if I could do something similar.
Since google maps supports GeoRSS I thought I’d give that a try first; pasting my AudioBoo feed into the search box on google maps gives me this map, which shows the boos without the audio players (no flash support).
I am caching the rss feed from AudioBoo, so updates might not appear. It would be easy enough to set this up so that the page would load boos from a user or tag in the url, /boo.php?tag=thetag or /boo.php?user=user, but that might affect my bandwidth.
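Pulling the locations out of such a feed is straightforward. Here is a sketch using Python's standard library, with a made-up Atom entry in the GeoRSS Simple shape (a real AudioBoo feed entry will have more elements than this):

```python
# Sketch: extract title and location from Atom entries carrying
# GeoRSS Simple points. The feed text here is a made-up example.
import xml.etree.ElementTree as ET

ATOM = "http://www.w3.org/2005/Atom"
GEORSS = "http://www.georss.org/georss"

feed = """<feed xmlns="http://www.w3.org/2005/Atom"
      xmlns:georss="http://www.georss.org/georss">
  <entry>
    <title>Skylark on the summit</title>
    <georss:point>56.2294 -4.3312</georss:point>
  </entry>
</feed>"""

boos = []
root = ET.fromstring(feed)
for entry in root.iter("{%s}entry" % ATOM):
    title = entry.findtext("{%s}title" % ATOM)
    point = entry.findtext("{%s}point" % GEORSS)
    if point:  # GeoRSS Simple is "lat lon" separated by whitespace
        lat, lon = map(float, point.split())
        boos.append((title, lat, lon))
```

Each (title, lat, lon) triple is then enough to place a marker, and the boo's mp3 url would hang off it for the player in the bubble.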
The other interesting thing I found was Shadowmaker, a webpage that makes shadows for google map icons, which is a nice touch. On the walk maps I never got round to doing that; Shadowmaker makes it so easy that I could not avoid it.
Anyway, I think this would be a really nice way to podcast a school trip; once a page like this is in place it will reflect any updates to AudioBoo. You could also make one that would map the boos tagged with a particular tag, although I don’t think AudioBoo has feeds for tags from a particular user.
I’d love to hear from anyone with a class or school interested in a project like this, and I’d lend a hand.
Yet another mapping/iPhone post. This might not seem like education, but I consider the mapping of walks etc. a sort of trial for possible teaching and learning activities. At Sandaig I was always interested in blogging trips (Sandaig Netherlands 2008 or Glencoe 06, for example). I am interested in trying to get pupils and groups to tell stories in different ways: audio, text, pictures and video; adding location into the mix seems like a good idea. This week I was talking to some of the instructors at Kilbowie Residential Outdoor Centre, Oban, discussing some of the potential for adding some more ICT into their mix through Glow.
On Friday I was going for a walk and decided to try a few different ways of recording the walk centred around the iPhone.
As usual I recorded a gpx file and took some photos with the phone for A Mapped Walk
I also took other pictures with my camera and geotagged them once I got home with gpicsync, suggested by Dan Stucke in a comment here. gpicsync is a visual front end to exiftool, which I’ve mentioned before, and works well. Unfortunately my iPhone battery gave up early as I was using lots of apps, but a few photos were mapped by Flickr; the rest, taken on the way back down, are untagged.
At the top of the hill I decided to try AudioBoo. I love the way AudioBoo combines a picture, the audio and a wee map, and is simple to use. Unfortunately I didn’t have a good enough signal to post the boo from the hill.
Instead I turned to posterous. The really good thing about posterous on the iPhone is that because it uses email you don’t need a signal; the mail app will just wait until it gets one and then send the mail. I found this out on my holiday this year when I seemed to get an occasional signal overnight, making posterous the easiest way to blog. I’ve also found out how to combine images and audio in an email from the iPhone, and because posterous now geolocates your post if there is a location in the exif data of any images posted, you get the same effect as AudioBoo. See Ben Donich – John’s posterous.
The trick is: take a photo, switch to the camera roll and click the share/mail icon, choose the picture and copy it (this will work with several images). Then open up the Voice Memos app, record some audio and mail it. You can paste the image(s) into your mail and send.
The last thing I tried was the lifecasting app (iTunes url), which allows you to choose some photos and then record a narration over a slideshow of the images. The result can be uploaded to YouTube or downloaded to your desktop as an m4v file (the app, like many others, acts as a wee server and puts up a webpage with the movies to download).
Lifecasting works fairly well; the fact you cannot mail the file is a pity. The other problem is that the slides are shown for a fixed length of time (the example below is the longest), so you have to fit your audio to the show. I did duplicate a couple of images to give myself longer to talk. If the slides could be set to last the length of the audio, and you could use mail or the metaWeblog API to upload them, this would be a great app for mobile learning.
I am continuing to use ExifTool by Phil Harvey for geotagging photos. It is a wonderful application that has many more features than the basic use I am making of it. Once installed it is simple to use, even if you do not usually use command-line stuff. A quick example using a Mac:
Photos in a folder on my desktop called ‘kilsyth hills’
gpx track on the desktop
Open the Terminal application (found in the Utilities folder)
In the Terminal, type exiftool -geotag, then drag the gpx track file onto the Terminal window, followed by the folder of images. You end up with something like this: exiftool -geotag /Users/johnjohn/Desktop/Lecket-hill.gpx /Users/johnjohn/Desktop/kilsyth\ hills (dragging escapes the space in the folder name for you). All you need to do is hit return, and the application runs through the photos and calculates the gps location of each from the time taken and the gpx track.
If you then import the images into iPhoto you can click the info icon on a photo’s thumbnail to see something like this:
And this info will put the photos on the map if you upload them to flickr.
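The matching exiftool does can be sketched as: find the two track points either side of the photo's timestamp and interpolate between them. A toy Python version (the track times and coordinates are invented):

```python
# Toy sketch of -geotag style matching: given a photo's timestamp,
# linearly interpolate a position from the surrounding track points.
from bisect import bisect_left
from datetime import datetime

track = [  # (time, lat, lon) sorted by time, as read from a gpx file
    (datetime(2011, 9, 4, 11, 0, 0), 56.2290, -4.3300),
    (datetime(2011, 9, 4, 11, 1, 0), 56.2300, -4.3320),
    (datetime(2011, 9, 4, 11, 2, 0), 56.2310, -4.3360),
]

def locate(photo_time):
    times = [t for t, _, _ in track]
    i = bisect_left(times, photo_time)
    if i == 0:                # before the track started
        return track[0][1:]
    if i == len(track):       # after the track ended
        return track[-1][1:]
    (t0, lat0, lon0), (t1, lat1, lon1) = track[i - 1], track[i]
    f = (photo_time - t0).total_seconds() / (t1 - t0).total_seconds()
    return (lat0 + f * (lat1 - lat0), lon0 + f * (lon1 - lon0))

lat, lon = locate(datetime(2011, 9, 4, 11, 0, 30))
```

The real tool handles time zones, clock drift and gaps in the track as well, which is exactly why it is worth using rather than rolling your own.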
I am continuing to mess about with this stuff. The last walk I mapped and tweeted brought a ton of great geo information from geography teacher Kenny73, who blogs at Odblog; one post, Odblog: Cellspin in the field, covers using cell phone gps and EveryTrail to produce maps like Ben Vane at EveryTrail and school grounds. The iPhone app I use for creating gpx tracks, Trails, is integrated with EveryTrail. This is a simpler way of creating map/gpx/photo mashups than mine, but I’ll continue with mine for fun and a bit of flexibility (I can add sounds, video etc. to the maps). Some sort of EveryTrail system would be easier to use with a class.
Finally, on the geo front, I heard that the Eye-Fi Share Wireless 2GB SD Card has been added to the UK Amazon store. It went out of stock almost immediately, but its features include geotagging of photos! I am not sure if it uses gps satellites for this or information about wireless networks to figure out the location, but if it is the former it will stay at the top of my wish list.
Update 02.08.09: On the ADE list David Baugh let me know that the Eye-Fi Explore uses triangulation of wireless access points and mobile towers. Not so useful for walks away from mobile and wifi coverage then.
Since my last walk map post I’ve made a bit of progress. I am now using Trails an iPhone app that:
allows you to record, import and export tracks onto your iPhone.
Trails is really nice; it records and shows position and altitude. It also allows you to cache map tiles when you have a good connection, to use later on a walk. You can zoom in quite close, and it has already been handy in finding out I was going the wrong way in the mist.
Trails allows you to email a track in both kml and gpx format. Clicking the kml file opens the trail in Google Earth.
I’ve been using GPSPhotoLinker, a free app, to add geotags to photos using the gpx track from Trails. Once you have done that they will be mapped by flickr.
I’ve then been using SuperCard to read the data from the photos and the gpx track and produce an xml file and a set of resized photos. The xml files can be used with the google maps api to show the track and photos on a google map.
I’ve started to put together some webpages to list and show the maps: Mapped Walks.
The idea is to end up with a SuperCard project that cuts out some of the steps, it would take in photos and gpx file and upload resized photos and xml file to the web. I just need a bit of time to write and test the scripts.
I have managed to add an mp3 player to some of the google bubbles on one map, which plays sound recorded on my phone. The aim is to have pictures, audio, video and text. The maps now also have links in the bubbles that take you from one to the next in the correct order; I think this could become an interesting way to tell a story that travels through space and time.