Grass in the foreground, two blurred people behind

My plan for the last day of term yesterday was a scavenger hunt, Capture the Flag, and tidying our MakerSpace.

The scavenger hunt was made in Keynote, a slide for each challenge.

I recently read Digital Scavenger Hunts – DigiLearnFalk, which shows how to make really attractive one-slide digital hunts in Keynote using placeholders. They even have some nice templates to use.

My own approach is much less attractive. I wanted quite a long list of ‘things’ to find and wanted to add audio to the video/photo mix. The result is much uglier, but only takes a couple of minutes to make.

I write (or copy from a list or lists found online) & edit the list in a text editor. Then copy it.

  1. Open Keynote & make a new Presentation.
  2. Create a title slide.
  3. Make a new slide and change its layout to Section Heading.
  4. Change the view to outline and paste in my list. This makes a slide for each line.
  5. Select all the new slides and choose Reapply Layout to Slide from the Format menu.
  6. Edit the Slide Layout and move the section heading to the top.
Keynote Scavenger Hunt – No Audio

I use Keynote every day in my class. Perhaps unfortunately for my class, I spend very little time designing slides. I mainly choose the default black on white theme. I try to follow the advice of Robert Macmillan and keep my slides simple.

The class didn’t have time on Wednesday to do more than make the slides and then we reviewed them together. Given more time, it could have been a fun task for the pupils to make the slides look good.

I just saw What is Digital Literacy? A guest blog from Andy McLaughlin, University of Aberdeen – DigiLearn, and a pointer to the discussion on LinkedIn.

But what do we do there – if we go open source or other methods are we giving our young people the skills to enter the workplace?

Ian Gibson

Ian and John, I’d love to hear your take on the idea that Big Tech’s “efficiency agenda” has been the biggest hindrance to digital skill development.

Andy McLaughlin

There are a lot of interesting ideas popping up in the conversation. I joined in, although I don’t really have a clear idea as to where I stand. Quite quickly I reached LinkedIn’s maximum character limit, so I thought I’d post here and link in there, POSSE style. Here are the rather ragged thoughts I wanted to post:

Of course in Scotland we have access to an Open Source product in the form of WordPress 😉 But I doubt there is much awareness of Open Source generally among my colleagues. As a primary teacher, I need to get my head round hundreds of experiences and outcomes, leaving little time for the reading, never mind the thinking needed in this area.

Open Source is involved in many workplaces, some even owned by ‘Big Tech’. Unfortunately Open Source and open technologies (RSS for example) do not have an army of paid and unpaid evangelists in the same way as ‘Big Tech’.

I am not suggesting we should abandon Big Tech, but we should be able to think about the implications.

I recently quoted this:

warning parents that although they think they are giving their children access to the internet, they are really giving the internet access to their children.

BBC World Service – The Documentary Podcast, Assignment: Ireland’s phone-free town

Could we replace parents with educators, children with pupils, and the internet with ‘Big Tech’?

Not sure I fully grok Big Tech’s “efficiency agenda”, but to my mind it might be jumping into using tech too far from the base metal? Just a few (20) years ago, I’d start by teaching pupils some basic text editing and a wee bit about the difference between bitmap and vector image software before moving on to more complex tools. I think I’d rather see a pupil ‘misusing’ PowerPoint or Keynote to make their own creative images than cycling through possibilities in a more sophisticated tool.

I am also open to the idea that a bit of friction in your toolkit might mean spending a bit more time thinking.

Bookmarked 20 outdoor bird games and activities (creativestarlearning.co.uk)
When learning about nature and wildlife, experiential outdoor sessions need to happen. We need children to play and to practice key skills so that observing and understanding birds is a key focus of our time. Over the decades, a number of outdoor educators such as Joseph Cornell, Steve Van Matre, T...

This and Creating Bird Friendly Schools and Nurseries | Creative STAR Learning | I’m a teacher, get me OUTSIDE here!

A gif of the micro:bit simulator

I just found this post in my drafts, last edited September 2023. The bug mentioned below is still there, and I think the creativity shown by my class is worth remembering.

For the past couple of weeks, we have been working on a micro:bit project in class. One of the interesting aspects of working with a diverse group is the need to introduce new ideas and topics to the class, especially when some students already have experience in the area. This year, my class is Primary 5, 6, and 7. The Primary 7s have already had some exposure to micro:bit and other block coding environments.

I decided to focus on the virtual pets project from the MakeCode site for our first project. In my classroom, we use the micro:bit app on our iPads for coding, and both it and the micro:bits have had a couple of useful improvements. Since the last session, the ability to download data to the iPad has been introduced; we haven’t explored it yet, but I hope to do so soon. Another improvement is a simpler method for resetting the micro:bit when connecting via Bluetooth, making it more accessible for smaller fingers.

I introduced the topic by discussing Tamagotchi pets, which I remember being popular in my class over two decades ago. The virtual pet project is a lot simpler in comparison, with just two features: when a pet is stroked, it smiles, and when it’s shaken, it frowns.
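For anyone curious what that looks like outside the block editor, here is a minimal sketch of the starter behaviour in MakeCode’s JavaScript view. The class worked in blocks rather than text, and the sketch assumes a micro:bit V2, since the touch logo is V2-only:

```typescript
// Minimal virtual pet sketch in MakeCode JavaScript (the class used the block view).
// Stroking the pet = pressing the touch logo (micro:bit V2); shaking uses the accelerometer.
input.onLogoEvent(TouchButtonEvent.Pressed, function () {
    basic.showIcon(IconNames.Happy)   // pet smiles when stroked
})
input.onGesture(Gesture.Shake, function () {
    basic.showIcon(IconNames.Sad)     // pet frowns when shaken
})
```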

The limited functionality left plenty of room for exploration for the five Primary 7s with experience, and plenty of time for me to make sure the others managed the basics: linking to the micro:bit, flashing code and so on. I did drop the words food and health into the conversation but didn’t go any further.

We ended up having three sessions with the micro:bits and I was delighted with the results. Between them the sevens recalled variables from last year and were off. We ended up with pets needing fed to avoid death, being sick if fed too much and getting annoyed if they were petted too much. The younger pupils managed the basics and extended them in simpler ways, animating chewing and drinking or reacting to different buttons.
One Primary 6 who had previous experience did just as well as the Primary 7s; his pet had these features:

  • Sleep
  • Be happy if stroked (press logo)
  • Be sad if shaken
  • Die (wait long enough and don’t feed it)
  • Be sick if fed too much (can be cured using B)
  • Be scared (by making a noise/blowing/filling the red bar to 180+)
  • Be fed (using A)
  • Get a health check (A+B)
micro pets on the Banton Biggies

The first three were part of the class instructions, taken from the MakeCode site; the rest were pupil ideas.
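Translated out of blocks, one way the feeding and health ideas might look in MakeCode’s JavaScript view is sketched below. This is a hypothetical reconstruction rather than the pupils’ actual code; the variable name `health` and the thresholds are mine.

```typescript
// Hypothetical sketch of the feeding/health extensions (not the pupils' actual code).
let health = 5
input.onButtonPressed(Button.A, function () {
    // Feed the pet with A; overfeeding makes it sick.
    health += 1
    if (health > 8) {
        basic.showIcon(IconNames.Confused)  // sick
    } else {
        basic.showIcon(IconNames.Happy)
    }
})
input.onButtonPressed(Button.B, function () {
    // B cures an overfed pet.
    health = 5
    basic.showIcon(IconNames.Happy)
})
basic.forever(function () {
    // Health drops over time; the pet dies if it is never fed.
    basic.pause(10000)
    health -= 1
    if (health <= 0) {
        basic.showIcon(IconNames.Skull)
    }
})
```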

Of course lots of mistakes were made along the way, but it was great to see solutions worked out, shared with neighbours and lights go on. Quite a few pupils used wee bits of free time to explore and test ideas completely independently.

On Friday we went to post about the work on our e-portfolios and embed the pets in the blogs. Editing the shortcode to do this is a bit tricky, and we also ran into a problem with the simulator not embedding properly: half the micro:bits were hidden. At first I thought this would be a problem with Glow Blogs, but later investigation showed it to be a problem with the MakeCode embed code. We worked around it by embedding the editor rather than the simulator. I do hope the MakeCode folk sort this out; since it affects their documentation too, I expect they will.

I continue to be a micro:bit fan and will be using them throughout the year, hopefully incorporating them into our MakerSpace projects too.

Webpage with description and image of a European Goldfinch on a branch against a blue sky.

Image Accessibility Creator · Streamlit

Impressive AI-generated descriptions and alt text from images. I found adding even one word helped.

Via Doug Belshaw: How to easily generate image descriptions and alt text | Thought Shrapnel

After taking a screenshot I fed it back to the webpage (too tempting) and got:

Image Description: A screenshot of a webpage with the header “asuo-ai-labs.streamlit.app” shown in the top center. The screen is divided into two main sections. On the left, under the heading “Image and Details,” there is a section for uploading an image. A placeholder box labeled “Drag and drop file here” or “Browse files” is visible. Below it, there is a thumbnail of an image labeled “IMG_9865.jpeg” which is 2.8 MB in size. The image thumbnail shows a European Goldfinch perched on a budding branch against a clear blue sky. Below the thumbnail, there is a toggle switch labeled “Add details about the image,” which is turned on. A text box is filled in with the word “Goldfinch.”

On the right side of the webpage, there is a section labeled “Image Description.” Below it, a detailed description of the image is provided, as well as a short alt text. The image description reads: “The image features a European Goldfinch perched on a budding branch against a backdrop of a clear blue sky. The Goldfinch displays vibrant plumage with a distinctive red face, a mix of black and white on its head, a warm beige chest, and brownish wings with yellow accents. The branch, adorned with small, white buds and green leaves, stretches horizontally across the bottom right of the image, while the bird is slightly right of center. The serene sky comprises soft, light blue gradients without any visible clouds.”

The alt-text reads: “European Goldfinch perched on a budding branch against a clear blue sky.”

Alt-Text: Webpage with description and image of a European Goldfinch on a branch against a blue sky.

AI in action

In the vast sea of educational blogs, The Learning Hub shines as a beacon of insightful content and valuable resources. With a dedication to excellence and a passion for learning, this blog has become a go-to destination for educators, students, and parents alike.

 

This was posted in response to a post on the Glow Blog help: Glow Blogs Start of Session FAQ 2022.

I’ve been generating a bit of text for example sites recently, and this text has the same smell I get from ChatGPT and Copilot.

Read Generative AI and Creative Learning: Concerns, Opportunities, and Choices by Mitchel Resnick
As each new wave of technology ripples through society, we need to decide if and how to integrate the technology into our learning environments. That was true with personal computers, then with the internet, and now with generative AI technologies.

I just listened to the generated audio rather than read this.

A really powerful summary of the instructionist and constructionist approaches to AI in education. Resnick is of course the father of Scratch, so is firmly on the constructionist side.

There are powerful ideas and examples of the ways AI could support a constructionist approach to learning and the 4 Ps: projects, passion, peers, and play.

I started to pull out quotes, but it is easier to suggest you just read the whole thing.

 I worry that inertia and market pressures will push the educational uses of generative AI in this direction.

This would be the worry.

The piece finishes with:

The choice is up to us. The choice is more educational and political than technological. What types of learning and education do we want for our children, our schools, and our society? All of us—as teachers, parents, school administrators, designers, developers, researchers, policymakers—need to consider our values and visions for learning and education, and make choices that align with our values and visions. It is up to us.

I do wonder if, in the mainstream, we have much choice. I don’t think that many decisions about educational technology have been very pure; the power of the big companies is massive. We should be thankful that more open, non-commercial tools like Scratch exist.

My class joined in the ‘AI Wonderland: Unleash Creativity with Make it hAPPen (P4-P7)’ webinar on Monday. It was a useful introduction for their age group on a topic we had not explored in class. In Teams I noticed this TeachMeet1 too. I finally signed up for it on Wednesday.

Given it started at 3:30 on Thursday and school finished for the Easter holidays at 2:30, it was a bit of a rush.

I had planned, the night before, to talk a bit about using ChatGPT for creating H5P content in Glow Blogs. I knocked up a quick Keynote of screenshots to avoid the danger of a live demo.

ChatGPT can quickly produce information which, once checked, can be used to create H5P content. What is especially useful is that it can format the information to work with H5P textual inputs. I’ve put some instructions on the Glow Blogs H5P examples site.
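As a hypothetical illustration of what that textual input can look like (the exact syntax varies by content type, and the real instructions are on the Glow Blogs H5P examples site), the H5P Fill in the Blanks type builds questions from plain text with the answers wrapped in asterisks and alternative answers separated by a slash. So a prompt along the lines of “write three fill-in-the-blank sentences about the water cycle, marking each answer with asterisks” can return text that pastes straight in:

```
Water from the sea *evaporates* when it is warmed by the sun.
The vapour cools and condenses into *clouds*.
When the droplets get heavy they fall as *rain/precipitation*.
```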

The TeachMeet was quite quiet, 3:30 on the last day of term was probably tricky for most folk. I enjoyed the other things shared, although I didn’t grab any links, except for Diffit. I hope to get the rest when the recording is released.

Most of the sharing mirrored mine in that they involved creating resources, quizzes and the like. One idea that stood out, and one I intend to use, was taking an interesting phrase from pupils’ writing and using it as an image prompt in Bing (I believe). This was demonstrated to the whole class and sounds like it would generate interesting discussions.

I’ve used some of the free AI tools, mostly ChatGPT, for a while now, mostly for simple text generation and some JavaScript or AppleScript help. I don’t doubt that, despite some glitches, it is potentially useful and interesting.

Is that an Elephant?

There are a lot of difficult and awkward questions around the use of GPT in teaching & learning. I’ve read a fair bit of discussion around the ethics at both ends of the process, but not much discussing the primary school level.

Things that worry me, beyond my knowledge, time, brain power or pay grade2:

  • The obvious ethics around where the data comes from: scraping possibly copyrighted works.
  • The bias of the data, racial bias is the one I’ve read about most, but I imagine there are many others.
  • Possible breaches of pupil/student data, safeguarding issues.
  • The commercial nature of the tools. A lot of these services seem to be freemium, with either a limited or time-limited resource set.

I’ll keep using AI in a casual way with minimum risk (I hope), but it feels like education is stepping into a can of worms in the same sort of way we have adopted most technology, in a rather haphazard way.

Feature image is an old gif I made from a public domain photo a few years ago.

  1. I’ve not been to a TeachMeet for a while so this intrigued me as much as AI ↩︎
  2. Over the time it has taken me to type this post I’ve seen this: Women’s faces stolen for AI ads selling ED pills and praising Putin – The Washington Post, and this: AI – two reports reveal a massive enterprise pause over security and ethics. I also asked ChatGPT to give me 400 words on the pros and cons of using AI in education. I’d say there is a lot of confusion about. ↩︎