I posted this this afternoon:

The Glow Blogs migration export will start on 19th September 2014

Any new posts or content added to blogs after that date will not be migrated to the new service.

The new service should be up and running by Oct 3rd 2014

The data from the current blog servers will be exported on the 19th of September and migrated to the new service ready for the go live date.

We’re making every effort to achieve the deadline of the new service for 3rd October. If anything changes, we will get in touch immediately.

This is not technically a content freeze as users will be able to add to their blog, rather it should be considered as a procedural content freeze.

We hope to be able to add a message to explain the situation to every blog dashboard but in case that is not technically possible we need as much help as we can get in spreading the word.

from: Glow Blogs Migration News | Glow Connect. Glow Connect is the information portal for Glow – a space for providing updates on the development and enhancement of the service and for sharing how teachers are using Glow.

Joshi by Juan Coloma Attribution-NonCommercial-ShareAlike License

I am sorry that the warning time for the content freeze is only just over two weeks, but it is only now that we can give an estimate. Timing is tight, and it might even slip a wee bit, but we thought it best to try and give as much notice as we can.

Back in June we proposed a content freeze over a short period during the summer and possibly another in September or October. As it turned out, the summer freeze did not happen, as it would not have given us any advantage: we could do a text export without a freeze.

We (especially the coders, technical and test members of the team) are very much working flat out to get the blogs migration in on time. The date is the current best estimate of when we will be ready to export the data from the RM servers and import it into the new ones.

Earlier we hoped that the freeze would be a bit shorter than we are now estimating, but it has become apparent that it will take a bit longer. Between the 19th and go live several things need to happen:

  • The database and web server files (images and uploads) need to be encrypted and copied to a secure disk. Before encryption a sort of fingerprint of the files is taken; this will let us know if the files we put on the new server are identical. The size of the database and files means this will take a while.
  • The disk will be taken to the new hosting and copied onto the server.
  • It is then decrypted and the md5 fingerprint compared to the original (there is a rough sketch of this check after the list).
  • The files will be put in place and hooked up to WordPress, or rather 33 instances of WordPress, one for each Local Authority plus a central one.
  • Lots of testing. Testing of other bits of the process, and of the new servers, has already started.
  • There are several rounds of testing, of different types that I am just beginning to get my head round. This will ensure we get the best possible service from the new blogs. The final rounds of testing will involve users from across Scotland, first on a ‘test’ environment and then on the new server before it goes live.
  • After everything looks good the new server gets the old blogs.glowscotland.org.uk domain and the blogs will be updatable again.
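
For anyone curious, the md5 fingerprint step is simple enough to sketch. Here is a rough illustration in Python (my own sketch, not the team’s actual tooling, and the folder paths are made up): checksum every file before the copy, do it again after the disk is decrypted on the new hosting, and compare.

import hashlib
import os

def md5_of_file(path, chunk_size=1024 * 1024):
    """Return the md5 hex digest of a file, read in chunks so big database
    dumps and upload folders do not need to fit in memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def fingerprint_tree(root):
    """Walk a folder and map each relative file path to its md5 digest."""
    fingerprints = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            fingerprints[os.path.relpath(full, root)] = md5_of_file(full)
    return fingerprints

# Hypothetical paths: taken on the old server before encryption...
before = fingerprint_tree("/exports/glowblogs")
# ...and again on the new server after the disk is decrypted and unpacked.
after = fingerprint_tree("/imports/glowblogs")

mismatched = [path for path in before if after.get(path) != before[path]]
print("files identical" if not mismatched else "check these: %s" % mismatched)

If the two sets of fingerprints match, the copy on the new server is byte for byte the same as the export taken from the RM servers.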

I’ve not numbered the points above because I have missed out many more steps. The project plan has been worked over repeatedly to make sure the quality of the result is as good as it can be and, by doing various things in parallel, to cut the time down to the minimum. Extra test engineers have been borrowed from other parts of Glow and other members of the team are helping with testing.

After the new service goes live the project will not stop: the blogs will then be upgraded to a current version of WordPress, and then the third phase of the project, to enhance the blogs for learning and teaching, will start.

super ruper by nnnnic Attribution-NonCommercial-NoDerivs License

I’ve been trying to post information about the glow blogs migration here when I can, but I am still getting a few questions via email, twitter etc. Here is a series of DMs:

Hi John. Been following the progress of GlowBlogs and reading your own blog. Question: Can I go ahead and set up a class blog using…

‘Old Glow’ and get class using it….then it’ll transfer across to ‘New-Glow’ with the bells and whistles in he coming weeks (months) ?

I (and other member of staff) really want to get cracking on this. How would we ensure the old style blog ‘goes accross’? Need to tell..

someone where it is?

Quite a few folk have asked the same sort of thing: can/should I set up a glow blog/e-portfolio now or wait?

The answer is: yes, if you set up a glow blog now it will be migrated to the new service.

Caveats

There will be a procedural content freeze, and the possibility of downtime if we do not make the 3rd of October deadline (we are working very hard to ensure we will).

Content Freeze

The database and files that make up the blogs are currently on RM servers; these need to be moved to new servers. Given the size of the data this will involve copying onto a portable disk. The copy will be encrypted. The disks need to be moved, the encrypted data securely copied to the new setup, decrypted and verified. The new system then needs to be thoroughly tested.

During this time the old blogs will be up and running, but any content added to them will not be migrated, and new blogs set up during the content freeze through the old glow Sharepoint portal will not be migrated either.

I am not sure how long the content freeze will be but it looks like being a week or so.

We will publicize the content freeze as much as possible, telling Glow Key Contacts in each Local Authority, publishing on Glow Connect and I’ll post here and tweet.

We also hope to be able to add a warning message on the dashboards of all the current glow blogs, but that solution needs to be created and tested.

Clarify icon

I’ve spent a fair bit of time, when working at North Lanarkshire and back in school, creating howto instructions for software or computer tasks. Generally this involves organizing a bunch of screenshots and text on a page. I usually use Pages (or sometimes Comic Life), occasionally Word. I’ve thought of myself as quite competent at grabbing screenshots (cmd-shift-4 on a mac, spacebar toggles rect/window capture), switching to Pages (cmd-tab, repeat tab until Pages is selected and let go), and pasting the image in (cmd-v) before command-tabbing back to wherever the screenshots are coming from.

Recently I’ve been making a few help sheets for glow blogs and on a whim remembered Clarify. I’d tested it before but not been impressed, for reasons I can’t recall. I got the application through a macheist software bundle a while back. Given I’ve quite a lot of screen-shooting to do I thought I’d give it another go. I was quite pleased to find that I qualified for a free update to Clarify 2. My first impressions of the application have been overturned.

Workflow

Clarify 2 is great for making documents that consist of a series of screenshots and text. The great advantage the application has over a more manual approach is workflow.

  1. You launch the application
  2. Switch to the application you want to explain
  3. Work through the process taking screenshots (cmd-shift-2) as you go.
  4. The screenshots are placed in a clarify document. Clarify stays in the background.
  5. After taking all of the screenshots you can switch to clarify.
  6. Work through the sections, adding titles, descriptions and annotating the images with the built in tools.
  7. Export to Word, PDF or HTML.

You can copy and paste as RTF or publish to WordPress, Dropbox or clarify-it.com (the latter is a free beta at the moment).

As you work through the clarify document you can resize the screenshots, annotate them and combine them. The defaults are sensible and the annotation tools are both simple and powerful.

Clarify Interface

The exports can be further enhanced with templates, but I’ve not tried that yet. The publishing to wordpress has worked well in a couple of tests.

To round up, Clarify seems to save time by improving the workflow, decreasing the amount of tinkering and adjusting to be done, and exporting to several useful formats. The application costs £18.70 for Mac or Windows and there is a Mac/Windows cross-platform license at £24.94. Well worth the money in my opinion.

I woke up the other morning to a bit of serendipity in my RSS reader that cheered me up.
First I read Alan’s great post Don’t Be a Platform Pawn. Next up was Marco Arment linking to and quoting Waffle on Social Media, which quoted in turn Community Services, which pointed to What’s a Twitter Timeline?. On the back of these posts and more, Doug Belshaw posted Twitter, algorithms, and digital dystopias (I got the last link via twitter, but it arrived in my rss reader too).

At the heart of all this are the current worries about what you see and who curates your reading. It is also linked, in my mind at least, to worries about who owns the space you publish in and the idea that you are the product if you are not the customer. It cheers me to see so much pushback against the commercial monoliths.

I’ve read and even posted about this before, as have many others, but it bears rethinking, or at least more mulling over; it is pertinent again with the redefinition of the twitter timeline and the various facebook problems that are popping up.

Doug points out:

they need to provide shareholder value which, given the web’s current dominant revenue model, is predicated on raising advertising dollars. Raising the kind of money they need depends upon user growth, not necessarily upon serving existing users. After all, if they’ve provided the space where all your friends and contacts hang out, you’re kind of locked in.

And we are ‘kind of’; we can also use a mix of tools and spaces and give them up when the discomfort is too great or the utility is poor. Doug has given up RSS in favour of twitter, G+ and facebook. I’ve stuck with it, along with scanning twitter (and harvesting links to my RSS reader) and a smidgen of G+. I lack Doug’s guilt at a pile of unread links in my feedreader and I am more than happy to mark all as read now and then.

I think both Alan and Doug would agree that it is ok to use and be used by the silos as long as you are aware and the positives outweigh the negatives?

What is great about Alan’s post is that he gives you recipes for how he gains the benefit of flickr, twitter and the like while keeping control over them; there are a lot of different recipes and links to follow. This presumes that you will use the tools with care, thought and a willingness to learn. I’d argue that it is also good fun. Here are a few tips of my own.

Know RSS from your elbow

RSS is still useful. It may be an old, trailing edge technology, but I still find my RSS reader better than twitter for finding interesting things to read. Perhaps because things pile up rather than steam by, perhaps because I follow around 2000 folk on twitter but have only a couple of hundred feeds or so in my reader.

One of the things I look forward to each week is Doug’s newsletter, Things I Learned This Week. It is an email list, but I subscribe in my RSS reader; I’ll leave any readers to work out how this is done :-) I’ve also got siftlinks hooked up to my twitter account. This gives me a feed of tweets with links from my timeline, and it also gives me a feed of my favourites with links. This is great: I use the favourite button in twitter to give feedback to folk (I liked this) and to ‘save’ interesting things. IFTTT has several recipes that will convert stuff to RSS, so you may find something useful there.

The nice thing about RSS is you can move from laptop to desktop to mobile and keep reading the content. The other major factor for me is how inoreader (web) and FeeddlerPro (iOS) allow me to post links to twitter, tumblr and more importantly to pinboard.
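
Part of what keeps RSS attractive, for me, is how easy it is to script around. As a tiny illustration (a sketch only, assuming the third-party feedparser library is installed, and using this blog’s feed as an example URL), pulling the latest titles and links out of any feed takes a handful of lines of Python:

import feedparser  # third-party library: pip install feedparser

# Any RSS or Atom feed will do; this URL is just an example.
feed = feedparser.parse("http://johnjohnston.info/blog/?feed=rss")

for entry in feed.entries[:10]:
    # Each entry carries the basics a reader needs: a title and a link.
    print(entry.get("title"), "-", entry.get("link"))

From there it is a short hop to harvesting links into pinboard or wherever else you fancy.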

Email is still interesting

I go out of my way to get Doug’s mail in my feed reader because it is content I want to hold onto for a while, but there are an increasing number of email services that provide reading, links or a mix, katexic clippings being a favourite example at the moment. Email lists are also a great way to get information pushed to you from a group.

Play with new things

Along with the old trailing edge technology.

As twitter and facebook and flickr evolve, watch out for the new things that are popping up all over the place. I am currently kicking the tyres of Fargo and Known, and keeping half an eye on Little Facebook Editor. Both Known and Little Facebook Editor can post to silos and other spaces: WordPress for LFE, while Known publishes to itself and optionally twitter, flickr and Facebook. I am pretty sure that I’ll not adopt these tools for major stuff anytime soon, but it is good to keep up with some different ways of doing things.

Update, I didn’t post this yesterday because I got distracted by MDwiki, and ended up building a quick test wiki in my dropbox.

I’ve just made my first post on Glow Connect.

Glow Connect is the information portal for Glow – a space for providing updates on the development and enhancement of the service and for sharing how teachers are using Glow. 

Glow Connect will be a central area for keeping up with glow development.

Here is the content of the post:

Glow Blogs Update August 2014

I’ve made a few posts over the summer about the Glow Blog migration, which give a bit more detail about what is happening:
Glow Blogs Summer 2014
Blog Migration Notes: Users
Glow Blog Migration Notes: e-Portfolios

This is a further update.

The blog migration project is well underway. There are three or four main chunks of work that need to be completed. The first development, by Code for the People, has been making exceptional progress. The hosting procurement has been completed, removing a bit of worry. Plans for migration of the data from the old servers to the new environment are well under way and some exploratory work is being carried out. Test plans are coming together nicely, and it is great to watch the whole project coming to life.

My role as product owner is making a bit more sense, and it is delightful to work back and forward with the folk building the requirements, developing and getting ready for testing. The attention to detail by the members of the blog team gives me confidence that we are creating a great ‘product’.

There is still a planned content freeze. This will cover the time the data leaves the RM servers and is installed and set running on the new servers. It is expected that this will be around a week. Given that the data being transported is sensitive it will need to be handled with care, encrypted and then decrypted. We are hoping to be able to give plenty of warning around the time of the content freeze. We also hope to have a plugin in place in the existing blogs that will add a message about this to the dashboard of every blog.

There continues to be a risk that the migration will not be complete before the old servers are turned off on the 3rd of October, which could result in some downtime; however we are managing this risk very closely.

In summary, we’re making good progress and I will keep you updated on Glow Connect.

I’ve been posting some glow blogs information here, so in the future I’ll probably cross post in the two places.

I am not sure where I saw this technique mentioned first, it might have been: Build cheap panning camera mounts for time lapse photography, but there are plenty of other links: stop motion pano ikea timer – Google Search

Pretty simple idea: you use a cheap ikea kitchen timer with some stop motion app; I used iMotion HD.

The above is not a very long one, the midges made it pretty short. Here is the setup:

IMG_5156

As part of the week 1 of P2PU Why Open? course participants were invited to join David Wiley on a google hangout (Why Open session with David Wiley). I could not make the live stream so have just finished watching the archive.

I’ve also posted the audio ripped from the session, with permission, over at EDUtalk.cc (As usual I find audio easier to access than video).

These are a few of the things that I found interesting in the hangout, not in any order and very much my own interpretation.

David is the founder of OpenContent.org, among many other things, and an expert on open content and open educational resources.

Throughout the talk David focused on the pragmatic rather than the idealistic, on what would make an impact over what was right or righteous.

He started by talking about the difference between the Free Software movement and the Open Source movement, and how Richard Stallman’s Four Freedoms inspired all the openness that followed. David’s view is that Open is less to do with correctness & morality and more practical.

David says Free involves a bit of moral grandstanding, giving no place for proprietary software. Open says being open is practical, and we can choose not to be open, which is not morally bad.

The other side of the argument is laid out in Why Open Source Misses the Point of Free Software – GNU Project – Free Software Foundation, and The Four Freedoms from Matt Mullenweg is worth considering from the point of view of the practicality of Free Software.

David talked of the Berne Convention that in 1886 changed the face of copyright:

Under the Convention, copyrights for creative works are automatically in force upon their creation without being asserted or declared. An author need not “register” or “apply for” a copyright in countries adhering to the Convention. As soon as a work is “fixed”, that is, written or recorded on some physical medium, its author is automatically entitled to all copyrights in the work and to any derivative works, unless and until the author explicitly disclaims them or until the copyright expires.

from: Berne Convention – Wikipedia.

This switched the default from ‘you are ok to copy’ to ‘you are not’. If you publish something you need to legally state that you wish to share it. This leads to existing material with unknown copyright not being published.

David works with the Open Content Definition, which uses the 5 Rs: the rights to Retain, Reuse, Revise, Remix and Redistribute. The Retain right has been added to address, and highlight, the problem of business models that control access, e.g. streamed media.

David gave a few examples of the practicality of open. Obviously OERs are cheaper than textbooks, but a major gain in moving towards open might be in higher education, where the movement to competency-based courses is slowly gaining ground. The argument is that these competencies are slow to develop and the process could be speeded up by opening the competencies. Open assessments would be another area to explore.

Someone in the hangout expressed the worry that publishing in the open would be less useful, from an employability angle, than publishing in a well established (and paywalled) publication. David discussed the impact of publishing in the open, demonstrating with Google Scholar the number of times open publications are cited compared to paywalled ones. Publishing in the open will maximise the number of people reading, so giving a better chance of making an impact. Again the outcome was more important than the philosophy.

Exploring another tension, that of Open vs Connected, David proposed that connected is a vice when taken to extremes. For example, there are now so many resources in google that it is hard to identify the best ones. More nodes and connections become noise at some point. Curation and structure are needed on top of connected. Curation is the biggest value that faculty brings to learning. Neither open nor connected should be an end in itself.

My own practice of working/blogging/learning in the open is based on a fairly fluffy feel-good factor. I’ve found over the years that this has had a positive effect on myself and the learners I work with. I covered this in a previous post. I now feel that it will be important to start to try and bring openness more formally into my day to day work, and I made small steps in that direction today. Instead of looking for good vibes I’ll be trying to introduce open where it can make an impact.

A lot of educational research, and I am going to choose my words carefully here, was utterly guff, was utterly, utterly guff, by that I mean, was complete speculation, rhetoric or opinion dressed up as science.

Tom Bennett talking on Radio #EDUtalk.

Radio Edutalk got off to a flying start last night with a great show. I am not sure if the fact that I was not near a mic had anything to do with this ;-)

David talked to Tom Bennett about research in education. Stirring stuff: I nodded along to the trashing of Brain Gym and the like, and the podcast gave lots of food for thought. A couple of places I really wished I had been near a mic:
One was round Tom’s idea that teachers should not be researchers. We have talked to a fair number of folk doing action research on Edutalk and I think their experience is valuable?

The other was picked up from a couple of different sections of the podcast. In one, Tom talked about the conferences he is organising being a place teachers could work things out for themselves, away from the influence of councils (I am paraphrasing here). Later he suggested that chains of Academies were big enough to carry out scientific research. My Local Authority hat wanted to ask if he would consider LAs suitable bodies to organise research, perhaps in conjunction with nearby Universities. I guess I am knee-jerking against Academy chains, and it is possible Tom is not that aware of the Scottish system of Local Authorities.


Lost Puppy flickr image by Tim Shields Creative Commons – CC BY-NC 2.0

Here are some notes around the effect that the glow blog migration will have on e-portfolios hosted on the WordPress instances. There are two main things to consider, users/members of blogs and links to access blogs.

User Management Issues

We are currently migrating the blogs, including e-portfolio blogs, to a new WordPress server initially running the same version of WordPress, 2.9.2.

The set up will mirror the existing set up, an instance for each Local Authority. The URLs will stay the same.
The permissions on the blogs and their private/glow only/public settings will stay the same.
Only users who have previously visited the blogs while logged onto glow will have the same access.

Going forward the settings for blogs will be handled in the blog rather than Sharepoint. This is partially because the new Sharepoint is cloud based and cannot be customised in the same way as the old glow portal and partially because of the advantages of having blogs stand on their own two feet (see below for details).

Users who were granted roles on a blog through the old glow group in which the blog was created, but who have never visited the blog, will have to be added again. (There is no trace of these users in WordPress and therefore no way of migrating them.) You can mitigate against this causing problems by asking those users to visit the blogs before the switchover. After migration, Admin users will be able to add these (or any other) users in the blog admin dashboard.

I’ve covered more about User Management in a previous post: Blog Migration Notes: Users.

Finding your e-Portfolio

If users followed the advice on setting up glow blogs as e-Portfolios, they would have created a group in the old glow portal to hold links to groups of e-portfolios. After the switch over to the new authentication in early October (the 3rd), the old portal will not be there.

There will be a need for teachers and pupils to be able to access blogs; there are a few possibilities in the short term.

  • Lists created before the portal was migrated to O365 will still be there in your O365 group. They will be buried in the migrated content but can be resurrected.
  • You could copy the list from old glow and paste it into a new glow group.
  • You could recreate the lists. This would be my favoured option as I would distribute the work to pupils.

For the third approach you would need to create a space where pupils could add their e-portfolio URLs. This could be a links list in O365 or a shared Word document in OneDrive. Pupils would need to have permissions set so that they could add to the links, or have edit permission on the document.

As a teacher I would not use links like this; it is too much work. I would bookmark them in my browser and put all the bookmarks in a folder. I could then log on to glow and open the folder of links in tabs. This would, after a wee wait, give me a set of tabs that I could quickly go through to visit each portfolio without clicks.

Post Migration Development

After migration it is intended to move into phase 2, which will be an upgrade of the blog software from 2.9.2 to 3.9.

Phase 3 will include improvements to the service, adding plugins and themes to increase functionality.

In Phase 3 it is the intention to improve e-portfolio blogs by improving the setup, how posts are organised and how the profiles are produced from those posts.
It is hoped that this work will be completed by March 2015.

WordPress only

The main advantages with moving the set up of blogs and e-portfolios away from Sharepoint to the WordPress server itself will be:

  • We will be able to keep the WordPress install a lot more up to date. This will allow users to benefit from new features as they are added to WordPress. With the old glow blogs it seems that it was too hard to upgrade WordPress.
  • The setup of e-portfolios will be shortened. Currently I find it takes around an hour to go through the setup with a class of pupils, assuming each pupil has access to a computer/device. Removing the Sharepoint element of the blogs will speed things up in the short term.
  • In the medium term the new blogs should let us develop new functionality which will speed things up even more and reduce the opportunities for making mistakes.
  • We should also be able to develop the portfolio functionality of the blogs through plugins. This could make the organisation of posts and production of the profile snapshot simpler.

Please get in touch if you have any questions about the glow blog migration.

43/365 by krystian_o Attribution License

TL;DR I’ve just migrated my blog, please let me know if you find broken things and I’ll try and fix them.

In the midst of the glowblogs migration project I am involved in professionally, I’ve been working on a wee migration of my own. For the past nine years I’ve been blogging using pivot (later pivotx); this weekend I’ve moved to WordPress.

Why?

I started using pivot back in 2004 as my class blog, mainly because it did not need a database on the website, and back then that cost a bit more. I stuck with it as I found it easy to theme and adapt for various classroom projects. It seemed fairly natural to use the same system myself.

Pivotx seems to be changing, but quite slowly; the promise of pivot 4 dates from 2012.

I’ve been attracted to several interesting WordPress technologies and plugins and now use it for edutalk, ScotEduBlogs and my ds106 blog. The feedwordpress plugin is of especial interest.

A wish to eat my own dog food given I am promoting WordPress for glow.

How

Originally I thought of turning the whole site/domain over to WordPress, to include my ds106 blog: 106 drop in, but that looked a wee bit too tricky at the moment. I also have a bunch of straightforward html pages and experiments which I want to leave in place. Also there are a few challenges to moving the pivot posts to WordPress that seemed enough for now.

There is not a simple pivotx importer for wordpress. I found Migrating your blog from PivotX to WordPress | filmvanalledag, which looked as if it was a near fit, but it missed out tags and comments. I’ve also been using disqus comments for my blog but wanted to move to standard ones without losing disqus.

That filmvanalledag post gave me a great start with example.org/?feed=rss&c=*&n=10000, which I used to download the rss feed for all of my posts (>800).

I decided that the RSS import would lose all my tags and comments so went for another approach. I have a bit of experience with kludging together a standard WordPress import from other things. This is probably of little interest to anyone but myself, but briefly: I used SuperCard to create a simple pseudo database of the rss, added in the missing keywords by downloading them directly from the database, and then manipulated it all into a wordpress friendly format, for example getting the tag list like this:

uid,"tag","contenttype",target_uid
2533,"assessment_is_for_learning","entry",1132
2535,"blogging","entry",1133
2537,"scotedublogs","entry",1135
2536,"newyear","entry",1135

Directly from the database and turning it into:

<category domain="post_tag" nicename="glowscotland"><![CDATA[glowscotland]]></category>
<category domain="post_tag" nicename="glowscot"><![CDATA[glowscot]]></category>
<category domain="post_tag" nicename="blogging"><![CDATA[blogging]]></category>
<category domain="post_tag" nicename="wordpress"><![CDATA[wordpress]]></category>
<category domain="category" nicename="wwwd"><![CDATA[wwwd]]></category>
<category domain="category" nicename="jj"><![CDATA[jj]]></category>

For each post.
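
The SuperCard kludge itself will not mean much to anyone else, but the shape of the transformation is simple. Here is a rough Python equivalent (an illustration only, not the script I actually used, and the csv filename is made up) that reads tag rows in the format above and spits out the <category> elements for a given entry:

import csv
from collections import defaultdict

# Tag rows exported from the pivotx database, in the CSV format shown above:
# uid,"tag","contenttype",target_uid
tags_by_entry = defaultdict(list)
with open("pivotx_tags.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row["contenttype"] == "entry":
            tags_by_entry[row["target_uid"]].append(row["tag"])

def category_elements(entry_uid, categories=("wwwd", "jj")):
    """Build the <category> lines to paste into a WordPress import item."""
    lines = []
    for tag in tags_by_entry.get(entry_uid, []):
        lines.append('<category domain="post_tag" nicename="%s"><![CDATA[%s]]></category>' % (tag, tag))
    for cat in categories:
        lines.append('<category domain="category" nicename="%s"><![CDATA[%s]]></category>' % (cat, cat))
    return "\n".join(lines)

# Entry 1135 from the sample above would get its scotedublogs and newyear tags.
print(category_elements("1135"))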

I tested the import of my export on a local version of wordpress and also worked out some htaccess stuff.
My old blog links were like this: http://johnjohnston.info/blog/?e=2462 and wordpress expects this http://johnjohnston.info/blog/?p=2462.

It took me a fair bit of googling and testing to get something that worked, although not quite correctly: once I turned on pretty links, http://johnjohnston.info/?e=2464 goes to http://johnjohnston.info/blag/what-is-openness/?e=2464, without removing the ?e=2464, while ?p=2464 works properly. Once tested, I created a new WordPress on the site at /blag, set it up, did a little more testing and have just moved it to /blog after changing the urls in the General Settings Screen.

The other main problem was that I was using disqus for my comments on the old blog and, being a packrat, I did not want to leave them behind. So I’ve added a bit of logic to this new blog where older posts will display the disqus form and any comments, but going forward I’ll use the standard WordPress ones. I can’t see any way to import the old disqus comments into WordPress at this point.

There are more things to fix, and I’ll try to pick these up as I go along. After that I’ll be looking to play around with some wordpress plugins and the like.